1<html><body>
2<style>
3
4body, h1, h2, h3, div, span, p, pre, a {
5 margin: 0;
6 padding: 0;
7 border: 0;
8 font-weight: inherit;
9 font-style: inherit;
10 font-size: 100%;
11 font-family: inherit;
12 vertical-align: baseline;
13}
14
15body {
16 font-size: 13px;
17 padding: 1em;
18}
19
20h1 {
21 font-size: 26px;
22 margin-bottom: 1em;
23}
24
25h2 {
26 font-size: 24px;
27 margin-bottom: 1em;
28}
29
30h3 {
31 font-size: 20px;
32 margin-bottom: 1em;
33 margin-top: 1em;
34}
35
36pre, code {
37 line-height: 1.5;
38 font-family: Monaco, 'DejaVu Sans Mono', 'Bitstream Vera Sans Mono', 'Lucida Console', monospace;
39}
40
41pre {
42 margin-top: 0.5em;
43}
44
45h1, h2, h3, p {
46  font-family: Arial, sans-serif;
47}
48
49h1, h2, h3 {
50 border-bottom: solid #CCC 1px;
51}
52
53.toc_element {
54 margin-top: 0.5em;
55}
56
57.firstline {
58  margin-left: 2em;
59}
60
61.method {
62 margin-top: 1em;
63 border: solid 1px #CCC;
64 padding: 1em;
65 background: #EEE;
66}
67
68.details {
69 font-weight: bold;
70 font-size: 14px;
71}
72
73</style>
74
75<h1><a href="dialogflow_v2.html">Dialogflow API</a> . <a href="dialogflow_v2.projects.html">projects</a> . <a href="dialogflow_v2.projects.agent.html">agent</a> . <a href="dialogflow_v2.projects.agent.environments.html">environments</a> . <a href="dialogflow_v2.projects.agent.environments.users.html">users</a> . <a href="dialogflow_v2.projects.agent.environments.users.sessions.html">sessions</a></h1>
76<h2>Instance Methods</h2>
77<p class="toc_element">
78 <code><a href="dialogflow_v2.projects.agent.environments.users.sessions.contexts.html">contexts()</a></code>
79</p>
80<p class="firstline">Returns the contexts Resource.</p>
81
82<p class="toc_element">
83 <code><a href="dialogflow_v2.projects.agent.environments.users.sessions.entityTypes.html">entityTypes()</a></code>
84</p>
85<p class="firstline">Returns the entityTypes Resource.</p>
86
87<p class="toc_element">
88 <code><a href="#deleteContexts">deleteContexts(parent, x__xgafv=None)</a></code></p>
89<p class="firstline">Deletes all active contexts in the specified session.</p>
90<p class="toc_element">
91 <code><a href="#detectIntent">detectIntent(session, body=None, x__xgafv=None)</a></code></p>
92<p class="firstline">Processes a natural language query and returns structured, actionable data as a result.</p>
93<h3>Method Details</h3>
94<div class="method">
95 <code class="details" id="deleteContexts">deleteContexts(parent, x__xgafv=None)</code>
96 <pre>Deletes all active contexts in the specified session.
97
98Args:
99 parent: string, Required. The name of the session to delete all contexts from. Format:
100`projects/&lt;Project ID&gt;/agent/sessions/&lt;Session ID&gt;` or `projects/&lt;Project
101ID&gt;/agent/environments/&lt;Environment ID&gt;/users/&lt;User ID&gt;/sessions/&lt;Session
102ID&gt;`.
103If `Environment ID` is not specified, we assume default &#x27;draft&#x27; environment.
104If `User ID` is not specified, we assume default &#x27;-&#x27; user. (required)
105 x__xgafv: string, V1 error format.
106 Allowed values
107 1 - v1 error format
108 2 - v2 error format
109
110Returns:
111 An object of the form:
112
113 { # A generic empty message that you can re-use to avoid defining duplicated
114 # empty messages in your APIs. A typical example is to use it as the request
115 # or the response type of an API method. For instance:
116 #
117 # service Foo {
118 # rpc Bar(google.protobuf.Empty) returns (google.protobuf.Empty);
119 # }
120 #
121 # The JSON representation for `Empty` is empty JSON object `{}`.
122 }</pre>
123</div>
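<p>For illustration, a minimal call to this method might look like the following (the project,
environment, user, and session IDs are placeholders, and application-default credentials are
assumed to be configured):</p>
<pre>
from googleapiclient.discovery import build

# Build the Dialogflow v2 client from the public discovery document.
service = build('dialogflow', 'v2')

# Placeholder resource name; substitute your own project, environment,
# user, and session IDs.
session_path = ('projects/my-project/agent/environments/my-env'
                '/users/-/sessions/my-session')

# Delete all active contexts in the session; the call returns an empty object.
service.projects().agent().environments().users().sessions().deleteContexts(
    parent=session_path).execute()
</pre>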
124
125<div class="method">
126 <code class="details" id="detectIntent">detectIntent(session, body=None, x__xgafv=None)</code>
127 <pre>Processes a natural language query and returns structured, actionable data
128as a result. This method is not idempotent, because it may cause contexts
129and session entity types to be updated, which in turn might affect
130results of future queries.
131
132Args:
133 session: string, Required. The name of the session this query is sent to. Format:
134`projects/&lt;Project ID&gt;/agent/sessions/&lt;Session ID&gt;`, or
135`projects/&lt;Project ID&gt;/agent/environments/&lt;Environment ID&gt;/users/&lt;User
136ID&gt;/sessions/&lt;Session ID&gt;`. If `Environment ID` is not specified, we assume
137default &#x27;draft&#x27; environment. If `User ID` is not specified, we assume the
138default &#x27;-&#x27; user. It&#x27;s up to the API caller to choose an appropriate `Session ID` and
139`User ID`. They can be a random number or some type of user and session
140identifiers (preferably hashed). The length of the `Session ID` and
141`User ID` must not exceed 36 characters. (required)
142 body: object, The request body.
143 The object takes the form of:
144
145{ # The request to detect user&#x27;s intent.
146 &quot;queryParams&quot;: { # Represents the parameters of the conversational query. # The parameters of this query.
147 &quot;payload&quot;: { # This field can be used to pass custom data to your webhook.
148 # Arbitrary JSON objects are supported.
149 # If supplied, the value is used to populate the
150 # `WebhookRequest.original_detect_intent_request.payload`
151 # field sent to your webhook.
152 &quot;a_key&quot;: &quot;&quot;, # Properties of the object.
153 },
154 &quot;geoLocation&quot;: { # An object representing a latitude/longitude pair. This is expressed as a pair # The geo location of this conversational query.
155 # of doubles representing degrees latitude and degrees longitude. Unless
156 # specified otherwise, this must conform to the
157 # &lt;a href=&quot;http://www.unoosa.org/pdf/icg/2012/template/WGS_84.pdf&quot;&gt;WGS84
158 # standard&lt;/a&gt;. Values must be within normalized ranges.
159 &quot;latitude&quot;: 3.14, # The latitude in degrees. It must be in the range [-90.0, +90.0].
160 &quot;longitude&quot;: 3.14, # The longitude in degrees. It must be in the range [-180.0, +180.0].
161 },
162 &quot;resetContexts&quot;: True or False, # Specifies whether to delete all contexts in the current session
163 # before the new ones are activated.
164 &quot;contexts&quot;: [ # The collection of contexts to be activated before this query is
165 # executed.
166 { # Represents a context.
167 &quot;lifespanCount&quot;: 42, # Optional. The number of conversational query requests after which the
168 # context expires. The default is `0`. If set to `0`, the context expires
169 # immediately. Contexts expire automatically after 20 minutes if there
170 # are no matching queries.
171 &quot;name&quot;: &quot;A String&quot;, # Required. The unique identifier of the context. Format:
172 # `projects/&lt;Project ID&gt;/agent/sessions/&lt;Session ID&gt;/contexts/&lt;Context ID&gt;`,
173 # or `projects/&lt;Project ID&gt;/agent/environments/&lt;Environment ID&gt;/users/&lt;User
174 # ID&gt;/sessions/&lt;Session ID&gt;/contexts/&lt;Context ID&gt;`.
175 #
176 # The `Context ID` is always converted to lowercase, may only contain
177 # characters in a-zA-Z0-9_-% and may be at most 250 bytes long.
178 #
179 # If `Environment ID` is not specified, we assume default &#x27;draft&#x27;
180 # environment. If `User ID` is not specified, we assume default &#x27;-&#x27; user.
181 #
182 # The following context names are reserved for internal use by Dialogflow.
183 # You should not use these contexts or create contexts with these names:
184 #
185 # * `__system_counters__`
186 # * `*_id_dialog_context`
187 # * `*_dialog_params_size`
188 &quot;parameters&quot;: { # Optional. The collection of parameters associated with this context.
189 #
190 # Depending on your protocol or client library language, this is a
191 # map, associative array, symbol table, dictionary, or JSON object
192 # composed of a collection of (MapKey, MapValue) pairs:
193 #
194 # - MapKey type: string
195 # - MapKey value: parameter name
196 # - MapValue type:
197 # - If parameter&#x27;s entity type is a composite entity: map
198 # - Else: string or number, depending on parameter value type
199 # - MapValue value:
200 # - If parameter&#x27;s entity type is a composite entity:
201 # map from composite entity property names to property values
202 # - Else: parameter value
203 &quot;a_key&quot;: &quot;&quot;, # Properties of the object.
204 },
205 },
206 ],
207 &quot;sentimentAnalysisRequestConfig&quot;: { # Configures the types of sentiment analysis to perform. # Configures the type of sentiment analysis to perform. If not
208 # provided, sentiment analysis is not performed.
209 &quot;analyzeQueryTextSentiment&quot;: True or False, # Instructs the service to perform sentiment analysis on
210 # `query_text`. If not provided, sentiment analysis is not performed on
211 # `query_text`.
212 },
213 &quot;timeZone&quot;: &quot;A String&quot;, # The time zone of this conversational query from the
214 # [time zone database](https://www.iana.org/time-zones), e.g.,
215 # America/New_York, Europe/Paris. If not provided, the time zone specified in
216 # agent settings is used.
217 &quot;sessionEntityTypes&quot;: [ # Additional session entity types to replace or extend developer
218 # entity types with. The entity synonyms apply to all languages and persist
219 # for the session of this query.
220 { # Represents a session entity type.
221 #
222 # Extends or replaces a custom entity type at the user session level (we
223 # refer to the entity types defined at the agent level as &quot;custom entity
224 # types&quot;).
225 #
226 # Note: session entity types apply to all queries, regardless of the language.
227 &quot;name&quot;: &quot;A String&quot;, # Required. The unique identifier of this session entity type. Format:
228 # `projects/&lt;Project ID&gt;/agent/sessions/&lt;Session ID&gt;/entityTypes/&lt;Entity Type
229 # Display Name&gt;`, or `projects/&lt;Project ID&gt;/agent/environments/&lt;Environment
230 # ID&gt;/users/&lt;User ID&gt;/sessions/&lt;Session ID&gt;/entityTypes/&lt;Entity Type Display
231 # Name&gt;`.
232 # If `Environment ID` is not specified, we assume default &#x27;draft&#x27;
233 # environment. If `User ID` is not specified, we assume default &#x27;-&#x27; user.
234 #
235 # `&lt;Entity Type Display Name&gt;` must be the display name of an existing entity
236 # type in the same agent that will be overridden or supplemented.
237 &quot;entityOverrideMode&quot;: &quot;A String&quot;, # Required. Indicates whether the additional data should override or
238 # supplement the custom entity type definition.
239 &quot;entities&quot;: [ # Required. The collection of entities associated with this session entity
240 # type.
241 { # An **entity entry** for an associated entity type.
242 &quot;value&quot;: &quot;A String&quot;, # Required. The primary value associated with this entity entry.
243 # For example, if the entity type is *vegetable*, the value could be
244 # *scallions*.
245 #
246 # For `KIND_MAP` entity types:
247 #
248 # * A reference value to be used in place of synonyms.
249 #
250 # For `KIND_LIST` entity types:
251 #
252 # * A string that can contain references to other entity types (with or
253 # without aliases).
254 &quot;synonyms&quot;: [ # Required. A collection of value synonyms. For example, if the entity type
255 # is *vegetable*, and `value` is *scallions*, a synonym could be *green
256 # onions*.
257 #
258 # For `KIND_LIST` entity types:
259 #
260 # * This collection must contain exactly one synonym equal to `value`.
261 &quot;A String&quot;,
262 ],
263 },
264 ],
265 },
266 ],
267 },
268 &quot;outputAudioConfig&quot;: { # Instructs the speech synthesizer on how to generate the output audio content. # Instructs the speech synthesizer how to generate the output
269 # audio. If this field is not set and agent-level speech synthesizer is not
270 # configured, no output audio is generated.
271 # If this audio config is supplied in a request, it overrides all existing
272 # text-to-speech settings applied to the agent.
273 &quot;sampleRateHertz&quot;: 42, # The synthesis sample rate (in hertz) for this audio. If not
274 # provided, then the synthesizer will use the default sample rate based on
275 # the audio encoding. If this is different from the voice&#x27;s natural sample
276 # rate, then the synthesizer will honor this request by converting to the
277 # desired sample rate (which might result in worse audio quality).
278 &quot;audioEncoding&quot;: &quot;A String&quot;, # Required. Audio encoding of the synthesized audio content.
279 &quot;synthesizeSpeechConfig&quot;: { # Configuration of how speech should be synthesized. # Configuration of how speech should be synthesized.
280 &quot;effectsProfileId&quot;: [ # Optional. An identifier which selects &#x27;audio effects&#x27; profiles that are
281 # applied on (post synthesized) text to speech. Effects are applied on top of
282 # each other in the order they are given.
283 &quot;A String&quot;,
284 ],
285 &quot;volumeGainDb&quot;: 3.14, # Optional. Volume gain (in dB) of the normal native volume supported by the
286 # specific voice, in the range [-96.0, 16.0]. If unset, or set to a value of
287 # 0.0 (dB), will play at normal native signal amplitude. A value of -6.0 (dB)
288 # will play at approximately half the amplitude of the normal native signal
289 # amplitude. A value of +6.0 (dB) will play at approximately twice the
290 # amplitude of the normal native signal amplitude. We strongly recommend not
291 # to exceed +10 (dB) as there&#x27;s usually no effective increase in loudness for
292 # any value greater than that.
293 &quot;pitch&quot;: 3.14, # Optional. Speaking pitch, in the range [-20.0, 20.0]. 20 means increase 20
294 # semitones from the original pitch. -20 means decrease 20 semitones from the
295 # original pitch.
296 &quot;voice&quot;: { # Description of which voice to use for speech synthesis. # Optional. The desired voice of the synthesized audio.
297 &quot;name&quot;: &quot;A String&quot;, # Optional. The name of the voice. If not set, the service will choose a
298 # voice based on the other parameters such as language_code and
299 # ssml_gender.
300 &quot;ssmlGender&quot;: &quot;A String&quot;, # Optional. The preferred gender of the voice. If not set, the service will
301 # choose a voice based on the other parameters such as language_code and
302          # name. Note that this is only a preference, not a requirement. If a
303 # voice of the appropriate gender is not available, the synthesizer should
304 # substitute a voice with a different gender rather than failing the request.
305 },
306 &quot;speakingRate&quot;: 3.14, # Optional. Speaking rate/speed, in the range [0.25, 4.0]. 1.0 is the normal
307 # native speed supported by the specific voice. 2.0 is twice as fast, and
308          # 0.5 is half as fast. If unset (0.0), defaults to the native 1.0 speed. Any
309 # other values &lt; 0.25 or &gt; 4.0 will return an error.
310 },
311 },
312 &quot;inputAudio&quot;: &quot;A String&quot;, # The natural language speech audio to be processed. This field
313 # should be populated iff `query_input` is set to an input audio config.
314 # A single request can contain up to 1 minute of speech audio data.
315 &quot;outputAudioConfigMask&quot;: &quot;A String&quot;, # Mask for output_audio_config indicating which settings in this
316 # request-level config should override speech synthesizer settings defined at
317 # agent-level.
318 #
319 # If unspecified or empty, output_audio_config replaces the agent-level
320 # config in its entirety.
321    &quot;queryInput&quot;: { # Represents the query input. # Required. The input specification. It can be set to:
322        #
323        # 1. an audio config which instructs the speech recognizer how to process
324        # the speech audio,
325        #
326        # 2. a conversational query in the form of text, or
327        #
328        # 3. an event that specifies which intent to trigger.
336 &quot;audioConfig&quot;: { # Instructs the speech recognizer how to process the audio content. # Instructs the speech recognizer how to process the speech audio.
337 &quot;singleUtterance&quot;: True or False, # If `false` (default), recognition does not cease until the
338 # client closes the stream.
339 # If `true`, the recognizer will detect a single spoken utterance in input
340 # audio. Recognition ceases when it detects the audio&#x27;s voice has
341 # stopped or paused. In this case, once a detected intent is received, the
342 # client should close the stream and start a new request with a new stream as
343 # needed.
344 # Note: This setting is relevant only for streaming methods.
345 # Note: When specified, InputAudioConfig.single_utterance takes precedence
346 # over StreamingDetectIntentRequest.single_utterance.
347 &quot;languageCode&quot;: &quot;A String&quot;, # Required. The language of the supplied audio. Dialogflow does not do
348 # translations. See [Language
349 # Support](https://cloud.google.com/dialogflow/docs/reference/language)
350 # for a list of the currently supported language codes. Note that queries in
351 # the same session do not necessarily need to specify the same language.
352 &quot;phraseHints&quot;: [ # A list of strings containing words and phrases that the speech
353 # recognizer should recognize with higher likelihood.
354 #
355 # See [the Cloud Speech
356 # documentation](https://cloud.google.com/speech-to-text/docs/basics#phrase-hints)
357 # for more details.
358 #
359 # This field is deprecated. Please use [speech_contexts]() instead. If you
360 # specify both [phrase_hints]() and [speech_contexts](), Dialogflow will
361 # treat the [phrase_hints]() as a single additional [SpeechContext]().
362 &quot;A String&quot;,
363 ],
364 &quot;speechContexts&quot;: [ # Context information to assist speech recognition.
365 #
366 # See [the Cloud Speech
367 # documentation](https://cloud.google.com/speech-to-text/docs/basics#phrase-hints)
368 # for more details.
369 { # Hints for the speech recognizer to help with recognition in a specific
370 # conversation state.
371 &quot;boost&quot;: 3.14, # Optional. Boost for this context compared to other contexts:
372 #
373 # * If the boost is positive, Dialogflow will increase the probability that
374 # the phrases in this context are recognized over similar sounding phrases.
375 # * If the boost is unspecified or non-positive, Dialogflow will not apply
376 # any boost.
377 #
378 # Dialogflow recommends that you use boosts in the range (0, 20] and that you
379 # find a value that fits your use case with binary search.
380 &quot;phrases&quot;: [ # Optional. A list of strings containing words and phrases that the speech
381 # recognizer should recognize with higher likelihood.
382 #
383 # This list can be used to:
384 # * improve accuracy for words and phrases you expect the user to say,
385 # e.g. typical commands for your Dialogflow agent
386 # * add additional words to the speech recognizer vocabulary
387 # * ...
388 #
389 # See the [Cloud Speech
390 # documentation](https://cloud.google.com/speech-to-text/quotas) for usage
391 # limits.
392 &quot;A String&quot;,
393 ],
394 },
395 ],
396 &quot;enableWordInfo&quot;: True or False, # If `true`, Dialogflow returns SpeechWordInfo in
397 # StreamingRecognitionResult with information about the recognized speech
398 # words, e.g. start and end time offsets. If false or unspecified, Speech
399 # doesn&#x27;t return any word-level information.
400 &quot;model&quot;: &quot;A String&quot;, # Which Speech model to select for the given request. Select the
401 # model best suited to your domain to get best results. If a model is not
402 # explicitly specified, then we auto-select a model based on the parameters
403 # in the InputAudioConfig.
404 # If enhanced speech model is enabled for the agent and an enhanced
405 # version of the specified model for the language does not exist, then the
406 # speech is recognized using the standard version of the specified model.
407 # Refer to
408 # [Cloud Speech API
409 # documentation](https://cloud.google.com/speech-to-text/docs/basics#select-model)
410 # for more details.
411 &quot;sampleRateHertz&quot;: 42, # Required. Sample rate (in Hertz) of the audio content sent in the query.
412 # Refer to
413 # [Cloud Speech API
414 # documentation](https://cloud.google.com/speech-to-text/docs/basics) for
415 # more details.
416 &quot;modelVariant&quot;: &quot;A String&quot;, # Which variant of the Speech model to use.
417 &quot;audioEncoding&quot;: &quot;A String&quot;, # Required. Audio encoding of the audio content to process.
418 },
419 &quot;event&quot;: { # Events allow for matching intents by event name instead of the natural # The event to be processed.
420 # language input. For instance, input `&lt;event: { name: &quot;welcome_event&quot;,
421 # parameters: { name: &quot;Sam&quot; } }&gt;` can trigger a personalized welcome response.
422 # The parameter `name` may be used by the agent in the response:
423 # `&quot;Hello #welcome_event.name! What can I do for you today?&quot;`.
424 &quot;languageCode&quot;: &quot;A String&quot;, # Required. The language of this query. See [Language
425 # Support](https://cloud.google.com/dialogflow/docs/reference/language)
426 # for a list of the currently supported language codes. Note that queries in
427 # the same session do not necessarily need to specify the same language.
428 &quot;name&quot;: &quot;A String&quot;, # Required. The unique identifier of the event.
429 &quot;parameters&quot;: { # The collection of parameters associated with the event.
430 #
431 # Depending on your protocol or client library language, this is a
432 # map, associative array, symbol table, dictionary, or JSON object
433 # composed of a collection of (MapKey, MapValue) pairs:
434 #
435 # - MapKey type: string
436 # - MapKey value: parameter name
437 # - MapValue type:
438 # - If parameter&#x27;s entity type is a composite entity: map
439 # - Else: string or number, depending on parameter value type
440 # - MapValue value:
441 # - If parameter&#x27;s entity type is a composite entity:
442 # map from composite entity property names to property values
443 # - Else: parameter value
444 &quot;a_key&quot;: &quot;&quot;, # Properties of the object.
445 },
446 },
447 &quot;text&quot;: { # Represents the natural language text to be processed. # The natural language text to be processed.
448 &quot;text&quot;: &quot;A String&quot;, # Required. The UTF-8 encoded natural language text to be processed.
449 # Text length must not exceed 256 characters.
450 &quot;languageCode&quot;: &quot;A String&quot;, # Required. The language of this conversational query. See [Language
451 # Support](https://cloud.google.com/dialogflow/docs/reference/language)
452 # for a list of the currently supported language codes. Note that queries in
453 # the same session do not necessarily need to specify the same language.
454 },
455 },
456 }
457
458 x__xgafv: string, V1 error format.
459 Allowed values
460 1 - v1 error format
461 2 - v2 error format
462
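For illustration, a minimal detectIntent request might look like the following
(the project, environment, user, session, and query text are placeholders, and
application-default credentials are assumed to be configured):

  from googleapiclient.discovery import build

  # Build the Dialogflow v2 client from the public discovery document.
  service = build('dialogflow', 'v2')

  # Placeholder resource name; substitute your own project, environment,
  # user, and session IDs.
  session_path = ('projects/my-project/agent/environments/my-env'
                  '/users/-/sessions/my-session')

  # Send a plain-text query and read the fulfillment text from the result.
  response = service.projects().agent().environments().users().sessions().detectIntent(
      session=session_path,
      body={
          'queryInput': {
              'text': {
                  'text': 'I want to book a table for two',
                  'languageCode': 'en-US',
              },
          },
      },
  ).execute()
  print(response['queryResult']['fulfillmentText'])
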
463Returns:
464 An object of the form:
465
466 { # The message returned from the DetectIntent method.
467 &quot;outputAudioConfig&quot;: { # Instructs the speech synthesizer on how to generate the output audio content. # The config used by the speech synthesizer to generate the output audio.
468 # If this audio config is supplied in a request, it overrides all existing
469 # text-to-speech settings applied to the agent.
470 &quot;sampleRateHertz&quot;: 42, # The synthesis sample rate (in hertz) for this audio. If not
471 # provided, then the synthesizer will use the default sample rate based on
472 # the audio encoding. If this is different from the voice&#x27;s natural sample
473 # rate, then the synthesizer will honor this request by converting to the
474 # desired sample rate (which might result in worse audio quality).
475 &quot;audioEncoding&quot;: &quot;A String&quot;, # Required. Audio encoding of the synthesized audio content.
476 &quot;synthesizeSpeechConfig&quot;: { # Configuration of how speech should be synthesized. # Configuration of how speech should be synthesized.
477 &quot;effectsProfileId&quot;: [ # Optional. An identifier which selects &#x27;audio effects&#x27; profiles that are
478 # applied on (post synthesized) text to speech. Effects are applied on top of
479 # each other in the order they are given.
480 &quot;A String&quot;,
481 ],
482 &quot;volumeGainDb&quot;: 3.14, # Optional. Volume gain (in dB) of the normal native volume supported by the
483 # specific voice, in the range [-96.0, 16.0]. If unset, or set to a value of
484 # 0.0 (dB), will play at normal native signal amplitude. A value of -6.0 (dB)
485 # will play at approximately half the amplitude of the normal native signal
486 # amplitude. A value of +6.0 (dB) will play at approximately twice the
487 # amplitude of the normal native signal amplitude. We strongly recommend not
488 # to exceed +10 (dB) as there&#x27;s usually no effective increase in loudness for
489 # any value greater than that.
490 &quot;pitch&quot;: 3.14, # Optional. Speaking pitch, in the range [-20.0, 20.0]. 20 means increase 20
491 # semitones from the original pitch. -20 means decrease 20 semitones from the
492 # original pitch.
493 &quot;voice&quot;: { # Description of which voice to use for speech synthesis. # Optional. The desired voice of the synthesized audio.
494 &quot;name&quot;: &quot;A String&quot;, # Optional. The name of the voice. If not set, the service will choose a
495 # voice based on the other parameters such as language_code and
496 # ssml_gender.
497 &quot;ssmlGender&quot;: &quot;A String&quot;, # Optional. The preferred gender of the voice. If not set, the service will
498 # choose a voice based on the other parameters such as language_code and
499          # name. Note that this is only a preference, not a requirement. If a
500 # voice of the appropriate gender is not available, the synthesizer should
501 # substitute a voice with a different gender rather than failing the request.
502 },
503 &quot;speakingRate&quot;: 3.14, # Optional. Speaking rate/speed, in the range [0.25, 4.0]. 1.0 is the normal
504 # native speed supported by the specific voice. 2.0 is twice as fast, and
505          # 0.5 is half as fast. If unset (0.0), defaults to the native 1.0 speed. Any
506 # other values &lt; 0.25 or &gt; 4.0 will return an error.
507 },
508 },
509 &quot;queryResult&quot;: { # Represents the result of conversational query or event processing. # The selected results of the conversational query or event processing.
510 # See `alternative_query_results` for additional potential results.
511 &quot;fulfillmentText&quot;: &quot;A String&quot;, # The text to be pronounced to the user or shown on the screen.
512 # Note: This is a legacy field, `fulfillment_messages` should be preferred.
513 &quot;parameters&quot;: { # The collection of extracted parameters.
514 #
515 # Depending on your protocol or client library language, this is a
516 # map, associative array, symbol table, dictionary, or JSON object
517 # composed of a collection of (MapKey, MapValue) pairs:
518 #
519 # - MapKey type: string
520 # - MapKey value: parameter name
521 # - MapValue type:
522 # - If parameter&#x27;s entity type is a composite entity: map
523 # - Else: string or number, depending on parameter value type
524 # - MapValue value:
525 # - If parameter&#x27;s entity type is a composite entity:
526 # map from composite entity property names to property values
527 # - Else: parameter value
528 &quot;a_key&quot;: &quot;&quot;, # Properties of the object.
529 },
530 &quot;sentimentAnalysisResult&quot;: { # The result of sentiment analysis as configured by # The sentiment analysis result, which depends on the
531 # `sentiment_analysis_request_config` specified in the request.
532 # `sentiment_analysis_request_config`.
533 &quot;queryTextSentiment&quot;: { # The sentiment, such as positive/negative feeling or association, for a unit # The sentiment analysis result for `query_text`.
534 # of analysis, such as the query text.
535 &quot;score&quot;: 3.14, # Sentiment score between -1.0 (negative sentiment) and 1.0 (positive
536 # sentiment).
537 &quot;magnitude&quot;: 3.14, # A non-negative number in the [0, +inf) range, which represents the absolute
538 # magnitude of sentiment, regardless of score (positive or negative).
539 },
540 },
541 &quot;intentDetectionConfidence&quot;: 3.14, # The intent detection confidence. Values range from 0.0
542 # (completely uncertain) to 1.0 (completely certain).
543 # This value is for informational purpose only and is only used to
544 # help match the best intent within the classification threshold.
545 # This value may change for the same end-user expression at any time due to a
546 # model retraining or change in implementation.
547        # If there are multiple `knowledge_answers` messages, this value is set to
548 # the greatest `knowledgeAnswers.match_confidence` value in the list.
549 &quot;allRequiredParamsPresent&quot;: True or False, # This field is set to:
550 #
551 # - `false` if the matched intent has required parameters and not all of
552 # the required parameter values have been collected.
553 # - `true` if all required parameter values have been collected, or if the
554 # matched intent doesn&#x27;t contain any required parameters.
555 &quot;queryText&quot;: &quot;A String&quot;, # The original conversational query text:
556 #
557 # - If natural language text was provided as input, `query_text` contains
558 # a copy of the input.
559 # - If natural language speech audio was provided as input, `query_text`
560 # contains the speech recognition result. If speech recognizer produced
561 # multiple alternatives, a particular one is picked.
562 # - If automatic spell correction is enabled, `query_text` will contain the
563 # corrected user input.
564 &quot;speechRecognitionConfidence&quot;: 3.14, # The Speech recognition confidence between 0.0 and 1.0. A higher number
565 # indicates an estimated greater likelihood that the recognized words are
566 # correct. The default of 0.0 is a sentinel value indicating that confidence
567 # was not set.
568 #
569 # This field is not guaranteed to be accurate or set. In particular this
570 # field isn&#x27;t set for StreamingDetectIntent since the streaming endpoint has
571 # separate confidence estimates per portion of the audio in
572 # StreamingRecognitionResult.
573 &quot;diagnosticInfo&quot;: { # Free-form diagnostic information for the associated detect intent request.
574 # The fields of this data can change without notice, so you should not write
575 # code that depends on its structure.
576 # The data may contain:
577 #
578 # - webhook call latency
579 # - webhook errors
580 &quot;a_key&quot;: &quot;&quot;, # Properties of the object.
581 },
582      &quot;intent&quot;: { # Represents an intent. # The intent that matched the conversational query. Some, but not
583          # all, fields are filled in this message, including but not limited to:
584 # `name`, `display_name`, `end_interaction` and `is_fallback`.
585 # Intents convert a number of user expressions or patterns into an action. An
586 # action is an extraction of a user command or sentence semantics.
587 &quot;events&quot;: [ # Optional. The collection of event names that trigger the intent.
588 # If the collection of input contexts is not empty, all of the contexts must
589 # be present in the active user session for an event to trigger this intent.
590 # Event names are limited to 150 characters.
591 &quot;A String&quot;,
592 ],
593 &quot;parentFollowupIntentName&quot;: &quot;A String&quot;, # Read-only after creation. The unique identifier of the parent intent in the
594 # chain of followup intents. You can set this field when creating an intent,
595 # for example with CreateIntent or
596 # BatchUpdateIntents, in order to make this
597 # intent a followup intent.
598 #
599 # It identifies the parent followup intent.
600 # Format: `projects/&lt;Project ID&gt;/agent/intents/&lt;Intent ID&gt;`.
601 &quot;priority&quot;: 42, # Optional. The priority of this intent. Higher numbers represent higher
602 # priorities.
603 #
604 # - If the supplied value is unspecified or 0, the service
605 # translates the value to 500,000, which corresponds to the
606 # `Normal` priority in the console.
607 # - If the supplied value is negative, the intent is ignored
608 # in runtime detect intent requests.
609 &quot;outputContexts&quot;: [ # Optional. The collection of contexts that are activated when the intent
610 # is matched. Context messages in this collection should not set the
611 # parameters field. Setting the `lifespan_count` to 0 will reset the context
612 # when the intent is matched.
613 # Format: `projects/&lt;Project ID&gt;/agent/sessions/-/contexts/&lt;Context ID&gt;`.
614 { # Represents a context.
615 &quot;lifespanCount&quot;: 42, # Optional. The number of conversational query requests after which the
616 # context expires. The default is `0`. If set to `0`, the context expires
617 # immediately. Contexts expire automatically after 20 minutes if there
618 # are no matching queries.
619 &quot;name&quot;: &quot;A String&quot;, # Required. The unique identifier of the context. Format:
620 # `projects/&lt;Project ID&gt;/agent/sessions/&lt;Session ID&gt;/contexts/&lt;Context ID&gt;`,
621 # or `projects/&lt;Project ID&gt;/agent/environments/&lt;Environment ID&gt;/users/&lt;User
622 # ID&gt;/sessions/&lt;Session ID&gt;/contexts/&lt;Context ID&gt;`.
623 #
624 # The `Context ID` is always converted to lowercase, may only contain
625 # characters in a-zA-Z0-9_-% and may be at most 250 bytes long.
626 #
627 # If `Environment ID` is not specified, we assume default &#x27;draft&#x27;
628 # environment. If `User ID` is not specified, we assume default &#x27;-&#x27; user.
629 #
630 # The following context names are reserved for internal use by Dialogflow.
631 # You should not use these contexts or create contexts with these names:
632 #
633 # * `__system_counters__`
634 # * `*_id_dialog_context`
635 # * `*_dialog_params_size`
636 &quot;parameters&quot;: { # Optional. The collection of parameters associated with this context.
637 #
638 # Depending on your protocol or client library language, this is a
639 # map, associative array, symbol table, dictionary, or JSON object
640 # composed of a collection of (MapKey, MapValue) pairs:
641 #
642 # - MapKey type: string
643 # - MapKey value: parameter name
644 # - MapValue type:
645 # - If parameter&#x27;s entity type is a composite entity: map
646 # - Else: string or number, depending on parameter value type
647 # - MapValue value:
648 # - If parameter&#x27;s entity type is a composite entity:
649 # map from composite entity property names to property values
650 # - Else: parameter value
651 &quot;a_key&quot;: &quot;&quot;, # Properties of the object.
652 },
653 },
654 ],
655 &quot;defaultResponsePlatforms&quot;: [ # Optional. The list of platforms for which the first responses will be
656 # copied from the messages in PLATFORM_UNSPECIFIED (i.e. default platform).
657 &quot;A String&quot;,
658 ],
659 &quot;messages&quot;: [ # Optional. The collection of rich messages corresponding to the
660 # `Response` field in the Dialogflow console.
661 { # A rich response message.
662 # Corresponds to the intent `Response` field in the Dialogflow console.
663 # For more information, see
664 # [Rich response
665 # messages](https://cloud.google.com/dialogflow/docs/intents-rich-messages).
666 &quot;listSelect&quot;: { # The card for presenting a list of options to select from. # The list card response for Actions on Google.
667 &quot;title&quot;: &quot;A String&quot;, # Optional. The overall title of the list.
668 &quot;subtitle&quot;: &quot;A String&quot;, # Optional. Subtitle of the list.
669 &quot;items&quot;: [ # Required. List items.
670 { # An item in the list.
671 &quot;title&quot;: &quot;A String&quot;, # Required. The title of the list item.
672 &quot;image&quot;: { # The image response message. # Optional. The image to display.
673 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
674 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
675 # e.g., screen readers.
676 },
677 &quot;description&quot;: &quot;A String&quot;, # Optional. The main text describing the item.
678 &quot;info&quot;: { # Additional info about the select item for when it is triggered in a # Required. Additional information about this option.
679 # dialog.
680 &quot;synonyms&quot;: [ # Optional. A list of synonyms that can also be used to trigger this
681 # item in dialog.
682 &quot;A String&quot;,
683 ],
684 &quot;key&quot;: &quot;A String&quot;, # Required. A unique key that will be sent back to the agent if this
685 # response is given.
686 },
687 },
688 ],
689 },
690 &quot;quickReplies&quot;: { # The quick replies response message. # The quick replies response.
691 &quot;title&quot;: &quot;A String&quot;, # Optional. The title of the collection of quick replies.
692 &quot;quickReplies&quot;: [ # Optional. The collection of quick replies.
693 &quot;A String&quot;,
694 ],
695 },
696 &quot;card&quot;: { # The card response message. # The card response.
697 &quot;title&quot;: &quot;A String&quot;, # Optional. The title of the card.
698 &quot;buttons&quot;: [ # Optional. The collection of card buttons.
699 { # Contains information about a button.
700 &quot;text&quot;: &quot;A String&quot;, # Optional. The text to show on the button.
701 &quot;postback&quot;: &quot;A String&quot;, # Optional. The text to send back to the Dialogflow API or a URI to
702 # open.
703 },
704 ],
705 &quot;subtitle&quot;: &quot;A String&quot;, # Optional. The subtitle of the card.
706 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file for the card.
707 },
708 &quot;basicCard&quot;: { # The basic card message. Useful for displaying information. # The basic card response for Actions on Google.
709 &quot;title&quot;: &quot;A String&quot;, # Optional. The title of the card.
710 &quot;image&quot;: { # The image response message. # Optional. The image for the card.
711 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
712 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
713 # e.g., screen readers.
714 },
715 &quot;formattedText&quot;: &quot;A String&quot;, # Required, unless image is present. The body text of the card.
716 &quot;buttons&quot;: [ # Optional. The collection of card buttons.
717 { # The button object that appears at the bottom of a card.
718 &quot;openUriAction&quot;: { # Opens the given URI. # Required. Action to take when a user taps on the button.
719 &quot;uri&quot;: &quot;A String&quot;, # Required. The HTTP or HTTPS scheme URI.
720 },
721 &quot;title&quot;: &quot;A String&quot;, # Required. The title of the button.
722 },
723 ],
724 &quot;subtitle&quot;: &quot;A String&quot;, # Optional. The subtitle of the card.
725 },
726 &quot;tableCard&quot;: { # Table card for Actions on Google. # Table card for Actions on Google.
727 &quot;title&quot;: &quot;A String&quot;, # Required. Title of the card.
728 &quot;rows&quot;: [ # Optional. Rows in this table of data.
729 { # Row of TableCard.
730 &quot;dividerAfter&quot;: True or False, # Optional. Whether to add a visual divider after this row.
731 &quot;cells&quot;: [ # Optional. List of cells that make up this row.
732 { # Cell of TableCardRow.
733 &quot;text&quot;: &quot;A String&quot;, # Required. Text in this cell.
734 },
735 ],
736 },
737 ],
738 &quot;subtitle&quot;: &quot;A String&quot;, # Optional. Subtitle to the title.
739 &quot;columnProperties&quot;: [ # Optional. Display properties for the columns in this table.
740 { # Column properties for TableCard.
741 &quot;header&quot;: &quot;A String&quot;, # Required. Column heading.
742 &quot;horizontalAlignment&quot;: &quot;A String&quot;, # Optional. Defines text alignment for all cells in this column.
743 },
744 ],
745 &quot;image&quot;: { # The image response message. # Optional. Image which should be displayed on the card.
746 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
747 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
748 # e.g., screen readers.
749 },
750 &quot;buttons&quot;: [ # Optional. List of buttons for the card.
751 { # The button object that appears at the bottom of a card.
752 &quot;openUriAction&quot;: { # Opens the given URI. # Required. Action to take when a user taps on the button.
753 &quot;uri&quot;: &quot;A String&quot;, # Required. The HTTP or HTTPS scheme URI.
754 },
755 &quot;title&quot;: &quot;A String&quot;, # Required. The title of the button.
756 },
757 ],
758 },
759 &quot;carouselSelect&quot;: { # The card for presenting a carousel of options to select from. # The carousel card response for Actions on Google.
760 &quot;items&quot;: [ # Required. Carousel items.
761 { # An item in the carousel.
762 &quot;title&quot;: &quot;A String&quot;, # Required. Title of the carousel item.
763 &quot;image&quot;: { # The image response message. # Optional. The image to display.
764 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
765 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
766 # e.g., screen readers.
767 },
768 &quot;description&quot;: &quot;A String&quot;, # Optional. The body text of the card.
769 &quot;info&quot;: { # Additional info about the select item for when it is triggered in a # Required. Additional info about the option item.
770 # dialog.
771 &quot;synonyms&quot;: [ # Optional. A list of synonyms that can also be used to trigger this
772 # item in dialog.
773 &quot;A String&quot;,
774 ],
775 &quot;key&quot;: &quot;A String&quot;, # Required. A unique key that will be sent back to the agent if this
776 # response is given.
777 },
778 },
779 ],
780 },
781 &quot;linkOutSuggestion&quot;: { # The suggestion chip message that allows the user to jump out to the app # The link out suggestion chip for Actions on Google.
782 # or website associated with this agent.
783 &quot;destinationName&quot;: &quot;A String&quot;, # Required. The name of the app or site this chip is linking to.
784 &quot;uri&quot;: &quot;A String&quot;, # Required. The URI of the app or site to open when the user taps the
785 # suggestion chip.
786 },
787 &quot;browseCarouselCard&quot;: { # Browse Carousel Card for Actions on Google. # Browse carousel card for Actions on Google.
788 # https://developers.google.com/actions/assistant/responses#browsing_carousel
789 &quot;items&quot;: [ # Required. List of items in the Browse Carousel Card. Minimum of two
790 # items, maximum of ten.
791 { # Browsing carousel tile
792 &quot;description&quot;: &quot;A String&quot;, # Optional. Description of the carousel item. Maximum of four lines of
793 # text.
794 &quot;openUriAction&quot;: { # Actions on Google action to open a given url. # Required. Action to present to the user.
795 &quot;urlTypeHint&quot;: &quot;A String&quot;, # Optional. Specifies the type of viewer that is used when opening
796 # the URL. Defaults to opening via web browser.
797 &quot;url&quot;: &quot;A String&quot;, # Required. URL
798 },
799 &quot;footer&quot;: &quot;A String&quot;, # Optional. Text that appears at the bottom of the Browse Carousel
800 # Card. Maximum of one line of text.
801 &quot;title&quot;: &quot;A String&quot;, # Required. Title of the carousel item. Maximum of two lines of text.
802 &quot;image&quot;: { # The image response message. # Optional. Hero image for the carousel item.
803 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
804 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
805 # e.g., screen readers.
806 },
807 },
808 ],
809 &quot;imageDisplayOptions&quot;: &quot;A String&quot;, # Optional. Settings for displaying the image. Applies to every image in
810 # items.
811 },
812 &quot;simpleResponses&quot;: { # The collection of simple response candidates. # The voice and text-only responses for Actions on Google.
813 # This message in `QueryResult.fulfillment_messages` and
814 # `WebhookResponse.fulfillment_messages` should contain only one
815 # `SimpleResponse`.
816 &quot;simpleResponses&quot;: [ # Required. The list of simple responses.
817 { # The simple response message containing speech or text.
818 &quot;displayText&quot;: &quot;A String&quot;, # Optional. The text to display.
819 &quot;textToSpeech&quot;: &quot;A String&quot;, # One of text_to_speech or ssml must be provided. The plain text of the
820 # speech output. Mutually exclusive with ssml.
821 &quot;ssml&quot;: &quot;A String&quot;, # One of text_to_speech or ssml must be provided. Structured spoken
822 # response to the user in the SSML format. Mutually exclusive with
823 # text_to_speech.
824 },
825 ],
826 },
827 &quot;mediaContent&quot;: { # The media content card for Actions on Google. # The media content card for Actions on Google.
828              &quot;mediaType&quot;: &quot;A String&quot;, # Optional. The type of the media content (for example, &quot;audio&quot;).
829 &quot;mediaObjects&quot;: [ # Required. List of media objects.
830 { # Response media object for media content card.
831 &quot;icon&quot;: { # The image response message. # Optional. Icon to display above media content.
832 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
833 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
834 # e.g., screen readers.
835 },
836 &quot;largeImage&quot;: { # The image response message. # Optional. Image to display above media content.
837 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
838 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
839 # e.g., screen readers.
840 },
841 &quot;name&quot;: &quot;A String&quot;, # Required. Name of media card.
842 &quot;description&quot;: &quot;A String&quot;, # Optional. Description of media card.
843                  &quot;contentUrl&quot;: &quot;A String&quot;, # Required. URL where the media is stored.
844 },
845 ],
846 },
847 &quot;image&quot;: { # The image response message. # The image response.
848 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
849 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
850 # e.g., screen readers.
851 },
852 &quot;payload&quot;: { # A custom platform-specific response.
853 &quot;a_key&quot;: &quot;&quot;, # Properties of the object.
854 },
855 &quot;text&quot;: { # The text response message. # The text response.
856 &quot;text&quot;: [ # Optional. The collection of the agent&#x27;s responses.
857 &quot;A String&quot;,
858 ],
859 },
860 &quot;suggestions&quot;: { # The collection of suggestions. # The suggestion chips for Actions on Google.
861 &quot;suggestions&quot;: [ # Required. The list of suggested replies.
862 { # The suggestion chip message that the user can tap to quickly post a reply
863 # to the conversation.
864                  &quot;title&quot;: &quot;A String&quot;, # Required. The text shown in the suggestion chip.
865 },
866 ],
867 },
868 &quot;platform&quot;: &quot;A String&quot;, # Optional. The platform that this message is intended for.
869 },
870 ],
871 &quot;name&quot;: &quot;A String&quot;, # Optional. The unique identifier of this intent.
872 # Required for Intents.UpdateIntent and Intents.BatchUpdateIntents
873 # methods.
874 # Format: `projects/&lt;Project ID&gt;/agent/intents/&lt;Intent ID&gt;`.
875 &quot;action&quot;: &quot;A String&quot;, # Optional. The name of the action associated with the intent.
876        # Note: The action name must not contain whitespace.
877 &quot;inputContextNames&quot;: [ # Optional. The list of context names required for this intent to be
878 # triggered.
879 # Format: `projects/&lt;Project ID&gt;/agent/sessions/-/contexts/&lt;Context ID&gt;`.
880 &quot;A String&quot;,
881 ],
882 &quot;webhookState&quot;: &quot;A String&quot;, # Optional. Indicates whether webhooks are enabled for the intent.
883 &quot;followupIntentInfo&quot;: [ # Read-only. Information about all followup intents that have this intent as
884 # a direct or indirect parent. We populate this field only in the output.
885 { # Represents a single followup intent in the chain.
886 &quot;followupIntentName&quot;: &quot;A String&quot;, # The unique identifier of the followup intent.
887 # Format: `projects/&lt;Project ID&gt;/agent/intents/&lt;Intent ID&gt;`.
888 &quot;parentFollowupIntentName&quot;: &quot;A String&quot;, # The unique identifier of the followup intent&#x27;s parent.
889 # Format: `projects/&lt;Project ID&gt;/agent/intents/&lt;Intent ID&gt;`.
890 },
891 ],
892 &quot;displayName&quot;: &quot;A String&quot;, # Required. The name of this intent.
893 &quot;rootFollowupIntentName&quot;: &quot;A String&quot;, # Read-only. The unique identifier of the root intent in the chain of
894 # followup intents. It identifies the correct followup intents chain for
895 # this intent. We populate this field only in the output.
896 #
897 # Format: `projects/&lt;Project ID&gt;/agent/intents/&lt;Intent ID&gt;`.
898 &quot;mlDisabled&quot;: True or False, # Optional. Indicates whether Machine Learning is disabled for the intent.
899 # Note: If `ml_disabled` setting is set to true, then this intent is not
900 # taken into account during inference in `ML ONLY` match mode. Also,
901 # auto-markup in the UI is turned off.
902 &quot;isFallback&quot;: True or False, # Optional. Indicates whether this is a fallback intent.
903 &quot;trainingPhrases&quot;: [ # Optional. The collection of examples that the agent is
904 # trained on.
905 { # Represents an example that the agent is trained on.
906 &quot;parts&quot;: [ # Required. The ordered list of training phrase parts.
907 # The parts are concatenated in order to form the training phrase.
908 #
909 # Note: The API does not automatically annotate training phrases like the
910 # Dialogflow Console does.
911 #
912 # Note: Do not forget to include whitespace at part boundaries,
913 # so the training phrase is well formatted when the parts are concatenated.
914 #
915 # If the training phrase does not need to be annotated with parameters,
916 # you just need a single part with only the Part.text field set.
917 #
918 # If you want to annotate the training phrase, you must create multiple
919 # parts, where the fields of each part are populated in one of two ways:
920 #
921 # - `Part.text` is set to a part of the phrase that has no parameters.
922 # - `Part.text` is set to a part of the phrase that you want to annotate,
923 # and the `entity_type`, `alias`, and `user_defined` fields are all
924 # set.
925 { # Represents a part of a training phrase.
926 &quot;alias&quot;: &quot;A String&quot;, # Optional. The parameter name for the value extracted from the
927 # annotated part of the example.
928 # This field is required for annotated parts of the training phrase.
929 &quot;userDefined&quot;: True or False, # Optional. Indicates whether the text was manually annotated.
930 # This field is set to true when the Dialogflow Console is used to
931 # manually annotate the part. When creating an annotated part with the
932 # API, you must set this to true.
933 &quot;text&quot;: &quot;A String&quot;, # Required. The text for this part.
934 &quot;entityType&quot;: &quot;A String&quot;, # Optional. The entity type name prefixed with `@`.
935 # This field is required for annotated parts of the training phrase.
936 },
937 ],
938 &quot;name&quot;: &quot;A String&quot;, # Output only. The unique identifier of this training phrase.
939 &quot;timesAddedCount&quot;: 42, # Optional. Indicates how many times this example was added to
940 # the intent. Each time a developer adds an existing sample by editing an
941 # intent or training, this counter is increased.
942 &quot;type&quot;: &quot;A String&quot;, # Required. The type of the training phrase.
943 },
944 ],
945 &quot;resetContexts&quot;: True or False, # Optional. Indicates whether to delete all contexts in the current
946 # session when this intent is matched.
947 &quot;parameters&quot;: [ # Optional. The collection of parameters associated with the intent.
948 { # Represents intent parameters.
949 &quot;mandatory&quot;: True or False, # Optional. Indicates whether the parameter is required. That is,
950 # whether the intent cannot be completed without collecting the parameter
951 # value.
952 &quot;defaultValue&quot;: &quot;A String&quot;, # Optional. The default value to use when the `value` yields an empty
953 # result.
954 # Default values can be extracted from contexts by using the following
955 # syntax: `#context_name.parameter_name`.
956 &quot;name&quot;: &quot;A String&quot;, # The unique identifier of this parameter.
957 &quot;isList&quot;: True or False, # Optional. Indicates whether the parameter represents a list of values.
958 &quot;value&quot;: &quot;A String&quot;, # Optional. The definition of the parameter value. It can be:
959 #
960 # - a constant string,
961 # - a parameter value defined as `$parameter_name`,
962 # - an original parameter value defined as `$parameter_name.original`,
963 # - a parameter value from some context defined as
964 # `#context_name.parameter_name`.
965 &quot;displayName&quot;: &quot;A String&quot;, # Required. The name of the parameter.
966 &quot;entityTypeDisplayName&quot;: &quot;A String&quot;, # Optional. The name of the entity type, prefixed with `@`, that
967 # describes values of the parameter. If the parameter is
968 # required, this must be provided.
969 &quot;prompts&quot;: [ # Optional. The collection of prompts that the agent can present to the
970 # user in order to collect a value for the parameter.
971 &quot;A String&quot;,
972 ],
973 },
974 ],
975 },
976 &quot;languageCode&quot;: &quot;A String&quot;, # The language that was triggered during intent detection.
977 # See [Language
978 # Support](https://cloud.google.com/dialogflow/docs/reference/language)
979 # for a list of the currently supported language codes.
980 &quot;outputContexts&quot;: [ # The collection of output contexts. If applicable,
981 # `output_contexts.parameters` contains entries with name
982 # `&lt;parameter name&gt;.original` containing the original parameter values
983 # before the query.
984 { # Represents a context.
985 &quot;lifespanCount&quot;: 42, # Optional. The number of conversational query requests after which the
986 # context expires. The default is `0`. If set to `0`, the context expires
987 # immediately. Contexts expire automatically after 20 minutes if there
988 # are no matching queries.
989 &quot;name&quot;: &quot;A String&quot;, # Required. The unique identifier of the context. Format:
990 # `projects/&lt;Project ID&gt;/agent/sessions/&lt;Session ID&gt;/contexts/&lt;Context ID&gt;`,
991 # or `projects/&lt;Project ID&gt;/agent/environments/&lt;Environment ID&gt;/users/&lt;User
992 # ID&gt;/sessions/&lt;Session ID&gt;/contexts/&lt;Context ID&gt;`.
993 #
994 # The `Context ID` is always converted to lowercase, may only contain
995 # characters in a-zA-Z0-9_-% and may be at most 250 bytes long.
996 #
997 # If `Environment ID` is not specified, we assume default &#x27;draft&#x27;
998 # environment. If `User ID` is not specified, we assume default &#x27;-&#x27; user.
999 #
1000 # The following context names are reserved for internal use by Dialogflow.
1001 # You should not use these contexts or create contexts with these names:
1002 #
1003 # * `__system_counters__`
1004 # * `*_id_dialog_context`
1005 # * `*_dialog_params_size`
1006 &quot;parameters&quot;: { # Optional. The collection of parameters associated with this context.
1007 #
1008 # Depending on your protocol or client library language, this is a
1009 # map, associative array, symbol table, dictionary, or JSON object
1010 # composed of a collection of (MapKey, MapValue) pairs:
1011 #
1012 # - MapKey type: string
1013 # - MapKey value: parameter name
1014 # - MapValue type:
1015 # - If parameter&#x27;s entity type is a composite entity: map
1016 # - Else: string or number, depending on parameter value type
1017 # - MapValue value:
1018 # - If parameter&#x27;s entity type is a composite entity:
1019 # map from composite entity property names to property values
1020 # - Else: parameter value
1021 &quot;a_key&quot;: &quot;&quot;, # Properties of the object.
1022 },
1023 },
1024 ],
1025 &quot;fulfillmentMessages&quot;: [ # The collection of rich messages to present to the user.
1026 { # A rich response message.
1027 # Corresponds to the intent `Response` field in the Dialogflow console.
1028 # For more information, see
1029 # [Rich response
1030 # messages](https://cloud.google.com/dialogflow/docs/intents-rich-messages).
1031 &quot;listSelect&quot;: { # The card for presenting a list of options to select from. # The list card response for Actions on Google.
1032 &quot;title&quot;: &quot;A String&quot;, # Optional. The overall title of the list.
1033 &quot;subtitle&quot;: &quot;A String&quot;, # Optional. Subtitle of the list.
1034 &quot;items&quot;: [ # Required. List items.
1035 { # An item in the list.
1036 &quot;title&quot;: &quot;A String&quot;, # Required. The title of the list item.
1037 &quot;image&quot;: { # The image response message. # Optional. The image to display.
1038 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
1039 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
1040 # e.g., screen readers.
1041 },
1042 &quot;description&quot;: &quot;A String&quot;, # Optional. The main text describing the item.
1043 &quot;info&quot;: { # Additional info about the select item for when it is triggered in a # Required. Additional information about this option.
1044 # dialog.
1045 &quot;synonyms&quot;: [ # Optional. A list of synonyms that can also be used to trigger this
1046 # item in dialog.
1047 &quot;A String&quot;,
1048 ],
1049 &quot;key&quot;: &quot;A String&quot;, # Required. A unique key that will be sent back to the agent if this
1050 # response is given.
1051 },
1052 },
1053 ],
1054 },
1055 &quot;quickReplies&quot;: { # The quick replies response message. # The quick replies response.
1056 &quot;title&quot;: &quot;A String&quot;, # Optional. The title of the collection of quick replies.
1057 &quot;quickReplies&quot;: [ # Optional. The collection of quick replies.
1058 &quot;A String&quot;,
1059 ],
1060 },
1061 &quot;card&quot;: { # The card response message. # The card response.
1062 &quot;title&quot;: &quot;A String&quot;, # Optional. The title of the card.
1063 &quot;buttons&quot;: [ # Optional. The collection of card buttons.
1064 { # Contains information about a button.
1065 &quot;text&quot;: &quot;A String&quot;, # Optional. The text to show on the button.
1066 &quot;postback&quot;: &quot;A String&quot;, # Optional. The text to send back to the Dialogflow API or a URI to
1067 # open.
1068 },
1069 ],
1070 &quot;subtitle&quot;: &quot;A String&quot;, # Optional. The subtitle of the card.
1071 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file for the card.
1072 },
1073 &quot;basicCard&quot;: { # The basic card message. Useful for displaying information. # The basic card response for Actions on Google.
1074 &quot;title&quot;: &quot;A String&quot;, # Optional. The title of the card.
1075 &quot;image&quot;: { # The image response message. # Optional. The image for the card.
1076 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
1077 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
1078 # e.g., screen readers.
1079 },
1080 &quot;formattedText&quot;: &quot;A String&quot;, # Required, unless image is present. The body text of the card.
1081 &quot;buttons&quot;: [ # Optional. The collection of card buttons.
1082 { # The button object that appears at the bottom of a card.
1083 &quot;openUriAction&quot;: { # Opens the given URI. # Required. Action to take when a user taps on the button.
1084 &quot;uri&quot;: &quot;A String&quot;, # Required. The HTTP or HTTPS scheme URI.
1085 },
1086 &quot;title&quot;: &quot;A String&quot;, # Required. The title of the button.
1087 },
1088 ],
1089 &quot;subtitle&quot;: &quot;A String&quot;, # Optional. The subtitle of the card.
1090 },
1091 &quot;tableCard&quot;: { # Table card for Actions on Google. # Table card for Actions on Google.
1092 &quot;title&quot;: &quot;A String&quot;, # Required. Title of the card.
1093 &quot;rows&quot;: [ # Optional. Rows in this table of data.
1094 { # Row of TableCard.
1095 &quot;dividerAfter&quot;: True or False, # Optional. Whether to add a visual divider after this row.
1096 &quot;cells&quot;: [ # Optional. List of cells that make up this row.
1097 { # Cell of TableCardRow.
1098 &quot;text&quot;: &quot;A String&quot;, # Required. Text in this cell.
1099 },
1100 ],
1101 },
1102 ],
1103       &quot;subtitle&quot;: &quot;A String&quot;, # Optional. Subtitle of the card.
1104 &quot;columnProperties&quot;: [ # Optional. Display properties for the columns in this table.
1105 { # Column properties for TableCard.
1106 &quot;header&quot;: &quot;A String&quot;, # Required. Column heading.
1107 &quot;horizontalAlignment&quot;: &quot;A String&quot;, # Optional. Defines text alignment for all cells in this column.
1108 },
1109 ],
1110 &quot;image&quot;: { # The image response message. # Optional. Image which should be displayed on the card.
1111 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
1112 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
1113 # e.g., screen readers.
1114 },
1115 &quot;buttons&quot;: [ # Optional. List of buttons for the card.
1116 { # The button object that appears at the bottom of a card.
1117 &quot;openUriAction&quot;: { # Opens the given URI. # Required. Action to take when a user taps on the button.
1118 &quot;uri&quot;: &quot;A String&quot;, # Required. The HTTP or HTTPS scheme URI.
1119 },
1120 &quot;title&quot;: &quot;A String&quot;, # Required. The title of the button.
1121 },
1122 ],
1123 },
1124 &quot;carouselSelect&quot;: { # The card for presenting a carousel of options to select from. # The carousel card response for Actions on Google.
1125 &quot;items&quot;: [ # Required. Carousel items.
1126 { # An item in the carousel.
1127 &quot;title&quot;: &quot;A String&quot;, # Required. Title of the carousel item.
1128 &quot;image&quot;: { # The image response message. # Optional. The image to display.
1129 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
1130 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
1131 # e.g., screen readers.
1132 },
1133 &quot;description&quot;: &quot;A String&quot;, # Optional. The body text of the card.
1134 &quot;info&quot;: { # Additional info about the select item for when it is triggered in a # Required. Additional info about the option item.
1135 # dialog.
1136 &quot;synonyms&quot;: [ # Optional. A list of synonyms that can also be used to trigger this
1137 # item in dialog.
1138 &quot;A String&quot;,
1139 ],
1140 &quot;key&quot;: &quot;A String&quot;, # Required. A unique key that will be sent back to the agent if this
1141 # response is given.
1142 },
1143 },
1144 ],
1145 },
1146 &quot;linkOutSuggestion&quot;: { # The suggestion chip message that allows the user to jump out to the app # The link out suggestion chip for Actions on Google.
1147 # or website associated with this agent.
1148 &quot;destinationName&quot;: &quot;A String&quot;, # Required. The name of the app or site this chip is linking to.
1149 &quot;uri&quot;: &quot;A String&quot;, # Required. The URI of the app or site to open when the user taps the
1150 # suggestion chip.
1151 },
1152 &quot;browseCarouselCard&quot;: { # Browse Carousel Card for Actions on Google. # Browse carousel card for Actions on Google.
1153 # https://developers.google.com/actions/assistant/responses#browsing_carousel
1154 &quot;items&quot;: [ # Required. List of items in the Browse Carousel Card. Minimum of two
1155 # items, maximum of ten.
1156 { # Browsing carousel tile
1157 &quot;description&quot;: &quot;A String&quot;, # Optional. Description of the carousel item. Maximum of four lines of
1158 # text.
1159             &quot;openUriAction&quot;: { # Actions on Google action to open a given URL. # Required. Action to present to the user.
1160 &quot;urlTypeHint&quot;: &quot;A String&quot;, # Optional. Specifies the type of viewer that is used when opening
1161 # the URL. Defaults to opening via web browser.
1162               &quot;url&quot;: &quot;A String&quot;, # Required. The URL to open.
1163 },
1164 &quot;footer&quot;: &quot;A String&quot;, # Optional. Text that appears at the bottom of the Browse Carousel
1165 # Card. Maximum of one line of text.
1166 &quot;title&quot;: &quot;A String&quot;, # Required. Title of the carousel item. Maximum of two lines of text.
1167 &quot;image&quot;: { # The image response message. # Optional. Hero image for the carousel item.
1168 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
1169 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
1170 # e.g., screen readers.
1171 },
1172 },
1173 ],
1174 &quot;imageDisplayOptions&quot;: &quot;A String&quot;, # Optional. Settings for displaying the image. Applies to every image in
1175 # items.
1176 },
1177 &quot;simpleResponses&quot;: { # The collection of simple response candidates. # The voice and text-only responses for Actions on Google.
1178 # This message in `QueryResult.fulfillment_messages` and
1179 # `WebhookResponse.fulfillment_messages` should contain only one
1180 # `SimpleResponse`.
1181 &quot;simpleResponses&quot;: [ # Required. The list of simple responses.
1182 { # The simple response message containing speech or text.
1183 &quot;displayText&quot;: &quot;A String&quot;, # Optional. The text to display.
1184 &quot;textToSpeech&quot;: &quot;A String&quot;, # One of text_to_speech or ssml must be provided. The plain text of the
1185 # speech output. Mutually exclusive with ssml.
1186 &quot;ssml&quot;: &quot;A String&quot;, # One of text_to_speech or ssml must be provided. Structured spoken
1187 # response to the user in the SSML format. Mutually exclusive with
1188 # text_to_speech.
1189 },
1190 ],
1191 },
1192 &quot;mediaContent&quot;: { # The media content card for Actions on Google. # The media content card for Actions on Google.
1193         &quot;mediaType&quot;: &quot;A String&quot;, # Optional. The type of the media content (e.g., &quot;audio&quot;).
1194 &quot;mediaObjects&quot;: [ # Required. List of media objects.
1195 { # Response media object for media content card.
1196 &quot;icon&quot;: { # The image response message. # Optional. Icon to display above media content.
1197 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
1198 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
1199 # e.g., screen readers.
1200 },
1201 &quot;largeImage&quot;: { # The image response message. # Optional. Image to display above media content.
1202 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
1203 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
1204 # e.g., screen readers.
1205 },
1206 &quot;name&quot;: &quot;A String&quot;, # Required. Name of media card.
1207 &quot;description&quot;: &quot;A String&quot;, # Optional. Description of media card.
1208             &quot;contentUrl&quot;: &quot;A String&quot;, # Required. The URL where the media is stored.
1209 },
1210 ],
1211 },
1212 &quot;image&quot;: { # The image response message. # The image response.
1213 &quot;imageUri&quot;: &quot;A String&quot;, # Optional. The public URI to an image file.
1214 &quot;accessibilityText&quot;: &quot;A String&quot;, # Optional. A text description of the image to be used for accessibility,
1215 # e.g., screen readers.
1216 },
1217 &quot;payload&quot;: { # A custom platform-specific response.
1218 &quot;a_key&quot;: &quot;&quot;, # Properties of the object.
1219 },
1220 &quot;text&quot;: { # The text response message. # The text response.
1221 &quot;text&quot;: [ # Optional. The collection of the agent&#x27;s responses.
1222 &quot;A String&quot;,
1223 ],
1224 },
1225 &quot;suggestions&quot;: { # The collection of suggestions. # The suggestion chips for Actions on Google.
1226 &quot;suggestions&quot;: [ # Required. The list of suggested replies.
1227 { # The suggestion chip message that the user can tap to quickly post a reply
1228 # to the conversation.
1229             &quot;title&quot;: &quot;A String&quot;, # Required. The text shown in the suggestion chip.
1230 },
1231 ],
1232 },
1233 &quot;platform&quot;: &quot;A String&quot;, # Optional. The platform that this message is intended for.
1234 },
1235 ],
1236 &quot;webhookPayload&quot;: { # If the query was fulfilled by a webhook call, this field is set to the
1237 # value of the `payload` field returned in the webhook response.
1238 &quot;a_key&quot;: &quot;&quot;, # Properties of the object.
1239 },
1240 &quot;action&quot;: &quot;A String&quot;, # The action name from the matched intent.
1241 &quot;webhookSource&quot;: &quot;A String&quot;, # If the query was fulfilled by a webhook call, this field is set to the
1242 # value of the `source` field returned in the webhook response.
1243 },
1244 &quot;outputAudio&quot;: &quot;A String&quot;, # The audio data bytes encoded as specified in the request.
1245 # Note: The output audio is generated based on the values of default platform
1246 # text responses found in the `query_result.fulfillment_messages` field. If
1247 # multiple default text responses exist, they will be concatenated when
1248 # generating audio. If no default platform text responses exist, the
1249 # generated audio content will be empty.
1250 &quot;webhookStatus&quot;: { # The `Status` type defines a logical error model that is suitable for # Specifies the status of the webhook request.
1251 # different programming environments, including REST APIs and RPC APIs. It is
1252 # used by [gRPC](https://github.com/grpc). Each `Status` message contains
1253 # three pieces of data: error code, error message, and error details.
1254 #
1255 # You can find out more about this error model and how to work with it in the
1256 # [API Design Guide](https://cloud.google.com/apis/design/errors).
1257 &quot;details&quot;: [ # A list of messages that carry the error details. There is a common set of
1258 # message types for APIs to use.
1259 {
1260 &quot;a_key&quot;: &quot;&quot;, # Properties of the object. Contains field @type with type URL.
1261 },
1262 ],
1263 &quot;code&quot;: 42, # The status code, which should be an enum value of google.rpc.Code.
1264 &quot;message&quot;: &quot;A String&quot;, # A developer-facing error message, which should be in English. Any
1265 # user-facing error message should be localized and sent in the
1266 # google.rpc.Status.details field, or localized by the client.
1267 },
1268 &quot;responseId&quot;: &quot;A String&quot;, # The unique identifier of the response. It can be used to
1269 # locate a response in the training example set or for reporting issues.
1270 }</pre>
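<p>Illustrative only, not part of the generated reference: a minimal sketch of calling <code>detectIntent</code> on this resource with the Python client and reading a few of the response fields documented above. The project, environment, user and session IDs, the query text, and the use of application default credentials are assumptions made for the example.</p>
<pre># Minimal sketch; the IDs and query text below are hypothetical placeholders.
from googleapiclient.discovery import build

# build() loads the Dialogflow v2 discovery document and uses
# application default credentials unless others are supplied.
service = build(&#x27;dialogflow&#x27;, &#x27;v2&#x27;)

session = (&#x27;projects/my-project/agent/environments/draft&#x27;
           &#x27;/users/-/sessions/my-session-id&#x27;)
body = {
    &#x27;queryInput&#x27;: {
        &#x27;text&#x27;: {&#x27;text&#x27;: &#x27;book a room for tomorrow&#x27;, &#x27;languageCode&#x27;: &#x27;en&#x27;},
    },
}

sessions = service.projects().agent().environments().users().sessions()
response = sessions.detectIntent(session=session, body=body).execute()

query_result = response.get(&#x27;queryResult&#x27;, {})

# Plain text replies are carried in fulfillmentMessages[].text.text[].
for message in query_result.get(&#x27;fulfillmentMessages&#x27;, []):
    for line in message.get(&#x27;text&#x27;, {}).get(&#x27;text&#x27;, []):
        print(line)

# Output contexts active after this turn, with their remaining lifespans.
for context in query_result.get(&#x27;outputContexts&#x27;, []):
    print(context[&#x27;name&#x27;], context.get(&#x27;lifespanCount&#x27;, 0))

# If a webhook fulfilled the query, webhookStatus follows the google.rpc.Status model.
status = query_result.get(&#x27;webhookStatus&#x27;)
if status and status.get(&#x27;code&#x27;, 0) != 0:
    print(&#x27;webhook error:&#x27;, status.get(&#x27;message&#x27;, &#x27;&#x27;))
</pre>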
1271</div>
1272
1273</body></html>