How to use the timestamp_millis method in LocalStack

Best Python code snippets using localstack_python

create-bigquery-view-template.py

Source: create-bigquery-view-template.py (GitHub)


# Copyright (C) 2018 - 2019 Robert Sahlin
# This program is free software: you can redistribute it and/or modify it under the terms of the GNU Affero General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
# This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Affero General Public License for more details.
# You should have received a copy of the GNU Affero General Public License along with this program. If not, see <https://www.gnu.org/licenses/>.

# Creates BigQuery views.
# gcloud deployment-manager deployments create bigquery-test-view --template create-bigquery-view-template.py --properties streamId:ua123456789

def AlphaNum(stream):
  return "".join([c if c.isalnum() else "" for c in stream])

def GenerateConfig(context):
  """Generate configuration."""
  resources = []

  # Event view
  resources.append({
    'name': 'bigquery-dataset-' + AlphaNum(context.properties['streamId']) + '-view-event',  # bigquery-dataset-ua123456789-view-event
    'type': 'bigquery.v2.table',
    'properties': {
      'datasetId': AlphaNum(context.properties['streamId']),
      'tableReference': {
        'projectId': context.env["project"],
        'datasetId': AlphaNum(context.properties['streamId']),
        'tableId': 'event'
      },
      'view': {
        'query': '''
SELECT
type,
clientId,
userId,
epochMillis,
date,
FORMAT_TIMESTAMP("%X", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as time,
FORMAT_TIMESTAMP("%G", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as year,
FORMAT_TIMESTAMP("%m", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as month,
FORMAT_TIMESTAMP("%d", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as day,
FORMAT_TIMESTAMP("%H", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as hour,
FORMAT_TIMESTAMP("%W", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as week,
FORMAT_TIMESTAMP("%w", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as weekday,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "eventCategory") as eventCategory,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "eventAction") as eventAction,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "eventLabel") as eventLabel,
(SELECT param.value.intValue FROM UNNEST(params) param WHERE param.key = "eventValue") as eventValue,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "url") as landingUrl,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "host") as landingHost,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "path") as landingPath
FROM `''' + context.env["project"] + '.' + AlphaNum(context.properties['streamId']) + '''.entities`
WHERE type = "event"''',
        'useLegacySql': False
      }
    }
  })

  # Exception view
  resources.append({
    'name': 'bigquery-dataset-' + AlphaNum(context.properties['streamId']) + '-view-exception',  # bigquery-dataset-ua123456789-view-exception
    'type': 'bigquery.v2.table',
    'properties': {
      'datasetId': AlphaNum(context.properties['streamId']),
      'tableReference': {
        'projectId': context.env["project"],
        'datasetId': AlphaNum(context.properties['streamId']),
        'tableId': 'exception'
      },
      'view': {
        'query': '''
SELECT
type,
clientId,
userId,
epochMillis,
date,
FORMAT_TIMESTAMP("%X", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as time,
FORMAT_TIMESTAMP("%G", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as year,
FORMAT_TIMESTAMP("%m", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as month,
FORMAT_TIMESTAMP("%d", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as day,
FORMAT_TIMESTAMP("%H", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as hour,
FORMAT_TIMESTAMP("%W", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as week,
FORMAT_TIMESTAMP("%w", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as weekday,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "exceptionDescription") as exceptionDescription,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "exceptionFatal") as exceptionFatal,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "url") as landingUrl,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "host") as landingHost,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "path") as landingPath
FROM `''' + context.env["project"] + '.' + AlphaNum(context.properties['streamId']) + '''.entities`
WHERE type = "exception"''',
        'useLegacySql': False
      }
    }
  })

  # Impression view
  resources.append({
    'name': 'bigquery-dataset-' + AlphaNum(context.properties['streamId']) + '-view-impression',  # bigquery-dataset-ua123456789-view-impression
    'type': 'bigquery.v2.table',
    'properties': {
      'datasetId': AlphaNum(context.properties['streamId']),
      'tableReference': {
        'projectId': context.env["project"],
        'datasetId': AlphaNum(context.properties['streamId']),
        'tableId': 'impression'
      },
      'view': {
        'query': '''
SELECT
type,
clientId,
userId,
epochMillis,
date,
FORMAT_TIMESTAMP("%X", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as time,
FORMAT_TIMESTAMP("%G", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as year,
FORMAT_TIMESTAMP("%m", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as month,
FORMAT_TIMESTAMP("%d", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as day,
FORMAT_TIMESTAMP("%H", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as hour,
FORMAT_TIMESTAMP("%W", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as week,
FORMAT_TIMESTAMP("%w", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as weekday,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "productListName") as productListName,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "productSku") as productSku,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "productName") as productName,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "productBrand") as productBrand,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "productCategory") as productCategory,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "productVariant") as productVariant,
(SELECT param.value.floatValue FROM UNNEST(params) param WHERE param.key = "productPrice") as productPrice,
(SELECT param.value.intValue FROM UNNEST(params) param WHERE param.key = "productPosition") as productPosition
FROM `''' + context.env["project"] + '.' + AlphaNum(context.properties['streamId']) + '''.entities`
WHERE type = "productImpression"''',
        'useLegacySql': False
      }
    }
  })

  # Pageview view
  resources.append({
    'name': 'bigquery-dataset-' + AlphaNum(context.properties['streamId']) + '-view-pageview',  # bigquery-dataset-ua123456789-view-pageview
    'type': 'bigquery.v2.table',
    'properties': {
      'datasetId': AlphaNum(context.properties['streamId']),
      'tableReference': {
        'projectId': context.env["project"],
        'datasetId': AlphaNum(context.properties['streamId']),
        'tableId': 'pageview'
      },
      'view': {
        'query': '''
SELECT
type,
clientId,
userId,
epochMillis,
date,
FORMAT_TIMESTAMP("%X", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as time,
FORMAT_TIMESTAMP("%G", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as year,
FORMAT_TIMESTAMP("%m", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as month,
FORMAT_TIMESTAMP("%d", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as day,
FORMAT_TIMESTAMP("%H", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as hour,
FORMAT_TIMESTAMP("%W", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as week,
FORMAT_TIMESTAMP("%w", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as weekday,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "url") as url,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "host") as host,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "path") as path
FROM `''' + context.env["project"] + '.' + AlphaNum(context.properties['streamId']) + '''.entities`
WHERE type = "pageview"''',
        'useLegacySql': False
      }
    }
  })

  # Product view
  resources.append({
    'name': 'bigquery-dataset-' + AlphaNum(context.properties['streamId']) + '-view-product',  # bigquery-dataset-ua123456789-view-product
    'type': 'bigquery.v2.table',
    'properties': {
      'datasetId': AlphaNum(context.properties['streamId']),
      'tableReference': {
        'projectId': context.env["project"],
        'datasetId': AlphaNum(context.properties['streamId']),
        'tableId': 'product'
      },
      'view': {
        'query': '''
SELECT
type,
clientId,
userId,
epochMillis,
date,
FORMAT_TIMESTAMP("%X", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as time,
FORMAT_TIMESTAMP("%G", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as year,
FORMAT_TIMESTAMP("%m", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as month,
FORMAT_TIMESTAMP("%d", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as day,
FORMAT_TIMESTAMP("%H", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as hour,
FORMAT_TIMESTAMP("%W", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as week,
FORMAT_TIMESTAMP("%w", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as weekday,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "productSku") as productSku,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "productName") as productName,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "productBrand") as productBrand,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "productCategory") as productCategory,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "productVariant") as productVariant,
(SELECT param.value.floatValue FROM UNNEST(params) param WHERE param.key = "productPrice") as productPrice,
(SELECT param.value.intValue FROM UNNEST(params) param WHERE param.key = "productQuantity") as productQuantity,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "productCouponCode") as productCouponCode,
(SELECT param.value.intValue FROM UNNEST(params) param WHERE param.key = "productPosition") as productPosition,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "productAction") as productAction,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "productActionList") as productActionList,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "transactionId") as transactionId,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "checkoutStep") as checkoutStep,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "checkoutStepOption") as checkoutStepOption
FROM `''' + context.env["project"] + '.' + AlphaNum(context.properties['streamId']) + '''.entities`
WHERE type IN ("product_detail", "product_click", "product_add", "product_remove", "product_checkout", "product_purchase", "product_refund")''',
        'useLegacySql': False
      }
    }
  })

  # Promotion view
  resources.append({
    'name': 'bigquery-dataset-' + AlphaNum(context.properties['streamId']) + '-view-promotion',  # bigquery-dataset-ua123456789-view-promotion
    'type': 'bigquery.v2.table',
    'properties': {
      'datasetId': AlphaNum(context.properties['streamId']),
      'tableReference': {
        'projectId': context.env["project"],
        'datasetId': AlphaNum(context.properties['streamId']),
        'tableId': 'promotion'
      },
      'view': {
        'query': '''
SELECT
type,
clientId,
userId,
epochMillis,
date,
FORMAT_TIMESTAMP("%X", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as time,
FORMAT_TIMESTAMP("%G", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as year,
FORMAT_TIMESTAMP("%m", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as month,
FORMAT_TIMESTAMP("%d", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as day,
FORMAT_TIMESTAMP("%H", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as hour,
FORMAT_TIMESTAMP("%W", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as week,
FORMAT_TIMESTAMP("%w", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as weekday,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "promotionId") as promotionId,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "promotionName") as promotionName,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "promotionCreative") as promotionCreative,
(SELECT param.value.intValue FROM UNNEST(params) param WHERE param.key = "promotionPosition") as promotionPosition,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "promotionAction") as promotionAction
FROM `''' + context.env["project"] + '.' + AlphaNum(context.properties['streamId']) + '''.entities`
WHERE type = "promotion"''',
        'useLegacySql': False
      }
    }
  })

  # Search view
  resources.append({
    'name': 'bigquery-dataset-' + AlphaNum(context.properties['streamId']) + '-view-search',  # bigquery-dataset-ua123456789-view-search
    'type': 'bigquery.v2.table',
    'properties': {
      'datasetId': AlphaNum(context.properties['streamId']),
      'tableReference': {
        'projectId': context.env["project"],
        'datasetId': AlphaNum(context.properties['streamId']),
        'tableId': 'search'
      },
      'view': {
        'query': '''
SELECT
type,
clientId,
userId,
epochMillis,
date,
FORMAT_TIMESTAMP("%X", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as time,
FORMAT_TIMESTAMP("%G", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as year,
FORMAT_TIMESTAMP("%m", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as month,
FORMAT_TIMESTAMP("%d", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as day,
FORMAT_TIMESTAMP("%H", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as hour,
FORMAT_TIMESTAMP("%W", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as week,
FORMAT_TIMESTAMP("%w", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as weekday,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "siteSearchTerm") as siteSearchTerm,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "siteSearchURL") as siteSearchURL,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "siteSearchPath") as siteSearchPath
FROM `''' + context.env["project"] + '.' + AlphaNum(context.properties['streamId']) + '''.entities`
WHERE type = "siteSearch"''',
        'useLegacySql': False
      }
    }
  })

  # Social view
  resources.append({
    'name': 'bigquery-dataset-' + AlphaNum(context.properties['streamId']) + '-view-social',  # bigquery-dataset-ua123456789-view-social
    'type': 'bigquery.v2.table',
    'properties': {
      'datasetId': AlphaNum(context.properties['streamId']),
      'tableReference': {
        'projectId': context.env["project"],
        'datasetId': AlphaNum(context.properties['streamId']),
        'tableId': 'social'
      },
      'view': {
        'query': '''
SELECT
type,
clientId,
userId,
epochMillis,
date,
FORMAT_TIMESTAMP("%X", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as time,
FORMAT_TIMESTAMP("%G", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as year,
FORMAT_TIMESTAMP("%m", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as month,
FORMAT_TIMESTAMP("%d", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as day,
FORMAT_TIMESTAMP("%H", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as hour,
FORMAT_TIMESTAMP("%W", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as week,
FORMAT_TIMESTAMP("%w", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as weekday,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "socialNetwork") as socialNetwork,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "socialAction") as socialAction,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "socialActionTarget") as socialActionTarget
FROM `''' + context.env["project"] + '.' + AlphaNum(context.properties['streamId']) + '''.entities`
WHERE type = "social"''',
        'useLegacySql': False
      }
    }
  })

  # Timing view
  resources.append({
    'name': 'bigquery-dataset-' + AlphaNum(context.properties['streamId']) + '-view-timing',  # bigquery-dataset-ua123456789-view-timing
    'type': 'bigquery.v2.table',
    'properties': {
      'datasetId': AlphaNum(context.properties['streamId']),
      'tableReference': {
        'projectId': context.env["project"],
        'datasetId': AlphaNum(context.properties['streamId']),
        'tableId': 'timing'
      },
      'view': {
        'query': '''
SELECT
type,
clientId,
userId,
epochMillis,
date,
FORMAT_TIMESTAMP("%X", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as time,
FORMAT_TIMESTAMP("%G", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as year,
FORMAT_TIMESTAMP("%m", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as month,
FORMAT_TIMESTAMP("%d", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as day,
FORMAT_TIMESTAMP("%H", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as hour,
FORMAT_TIMESTAMP("%W", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as week,
FORMAT_TIMESTAMP("%w", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as weekday,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "userTimingCategory") as userTimingCategory,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "userTimingVariableName") as userTimingVariableName,
(SELECT param.value.intValue FROM UNNEST(params) param WHERE param.key = "userTimingTime") as userTimingTime,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "userTimingLabel") as userTimingLabel,
(SELECT param.value.intValue FROM UNNEST(params) param WHERE param.key = "pageLoadTime") as pageLoadTime,
(SELECT param.value.intValue FROM UNNEST(params) param WHERE param.key = "dnsTime") as dnsTime,
(SELECT param.value.intValue FROM UNNEST(params) param WHERE param.key = "pageDownloadTime") as pageDownloadTime,
(SELECT param.value.intValue FROM UNNEST(params) param WHERE param.key = "redirectResponseTime") as redirectResponseTime,
(SELECT param.value.intValue FROM UNNEST(params) param WHERE param.key = "tcpConnectTime") as tcpConnectTime,
(SELECT param.value.intValue FROM UNNEST(params) param WHERE param.key = "serverResponseTime") as serverResponseTime,
(SELECT param.value.intValue FROM UNNEST(params) param WHERE param.key = "domInteractiveTime") as domInteractiveTime,
(SELECT param.value.intValue FROM UNNEST(params) param WHERE param.key = "contentLoadTime") as contentLoadTime
FROM `''' + context.env["project"] + '.' + AlphaNum(context.properties['streamId']) + '''.entities`
WHERE type = "timing"''',
        'useLegacySql': False
      }
    }
  })

  # Traffic view
  resources.append({
    'name': 'bigquery-dataset-' + AlphaNum(context.properties['streamId']) + '-view-traffic',  # bigquery-dataset-ua123456789-view-traffic
    'type': 'bigquery.v2.table',
    'properties': {
      'datasetId': AlphaNum(context.properties['streamId']),
      'tableReference': {
        'projectId': context.env["project"],
        'datasetId': AlphaNum(context.properties['streamId']),
        'tableId': 'traffic'
      },
      'view': {
        'query': '''
SELECT
type,
clientId,
userId,
epochMillis,
date,
FORMAT_TIMESTAMP("%X", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as time,
FORMAT_TIMESTAMP("%G", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as year,
FORMAT_TIMESTAMP("%m", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as month,
FORMAT_TIMESTAMP("%d", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as day,
FORMAT_TIMESTAMP("%H", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as hour,
FORMAT_TIMESTAMP("%W", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as week,
FORMAT_TIMESTAMP("%w", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as weekday,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "campaignName") as campaignName,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "campaignSource") as campaignSource,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "campaignMedium") as campaignMedium,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "campaignContent") as campaignContent,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "campaignKeyword") as campaignKeyword,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "campaignId") as campaignId,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "googleAdwordsId") as googleAdwordsId,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "googleDisplayId") as googleDisplayId,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "referer") as referer,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "refererHost") as refererHost,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "refererPath") as refererPath,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "url") as landingUrl,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "host") as landingHost,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "path") as landingPath
FROM `''' + context.env["project"] + '.' + AlphaNum(context.properties['streamId']) + '''.entities`
WHERE type = "traffic"''',
        'useLegacySql': False
      }
    }
  })

  # Transaction view
  resources.append({
    'name': 'bigquery-dataset-' + AlphaNum(context.properties['streamId']) + '-view-transaction',  # bigquery-dataset-ua123456789-view-transaction
    'type': 'bigquery.v2.table',
    'properties': {
      'datasetId': AlphaNum(context.properties['streamId']),
      'tableReference': {
        'projectId': context.env["project"],
        'datasetId': AlphaNum(context.properties['streamId']),
        'tableId': 'transaction'
      },
      'view': {
        'query': '''
SELECT
type,
clientId,
userId,
epochMillis,
date,
FORMAT_TIMESTAMP("%X", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as time,
FORMAT_TIMESTAMP("%G", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as year,
FORMAT_TIMESTAMP("%m", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as month,
FORMAT_TIMESTAMP("%d", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as day,
FORMAT_TIMESTAMP("%H", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as hour,
FORMAT_TIMESTAMP("%W", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as week,
FORMAT_TIMESTAMP("%w", TIMESTAMP_MILLIS(epochMillis), "Europe/Stockholm") as weekday,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "transactionId") as transactionId,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "affiliation") as affiliation,
(SELECT param.value.floatValue FROM UNNEST(params) param WHERE param.key = "revenue") as revenue,
(SELECT param.value.floatValue FROM UNNEST(params) param WHERE param.key = "tax") as tax,
(SELECT param.value.floatValue FROM UNNEST(params) param WHERE param.key = "shipping") as shipping,
(SELECT param.value.stringValue FROM UNNEST(params) param WHERE param.key = "couponCode") as couponCode
FROM `''' + context.env["project"] + '.' + AlphaNum(context.properties['streamId']) + '''.entities`
WHERE type = "transaction"''',
        'useLegacySql': False
      }
    }
  })
...
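In this template, timestamp_millis shows up through BigQuery's TIMESTAMP_MILLIS SQL function: every view converts the epochMillis column (milliseconds since the Unix epoch) into a TIMESTAMP and then formats it in the Europe/Stockholm time zone with FORMAT_TIMESTAMP. The short sketch below mirrors that conversion in plain Python so you can sanity-check what a given epochMillis value should produce; the helper name and the sample value are illustrative, not part of the template.

# Rough Python equivalent of TIMESTAMP_MILLIS(epochMillis) followed by
# FORMAT_TIMESTAMP(fmt, ..., "Europe/Stockholm"), for sanity-checking values.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

def format_epoch_millis(epoch_millis, fmt, tz="Europe/Stockholm"):
    # TIMESTAMP_MILLIS interprets the integer as milliseconds since 1970-01-01 00:00:00 UTC.
    ts = datetime.fromtimestamp(epoch_millis / 1000.0, tz=timezone.utc)
    # FORMAT_TIMESTAMP with a time zone argument renders that instant in the given zone.
    return ts.astimezone(ZoneInfo(tz)).strftime(fmt)

epoch_millis = 1546300800000  # sample value: 2019-01-01 00:00:00 UTC
print(format_epoch_millis(epoch_millis, "%X"))  # local time in Stockholm, 01:00:00
print(format_epoch_millis(epoch_millis, "%H"))  # hour of day, 01

Note that the SQL and Python format specifiers (%X, %H, and so on) follow the same strftime-style conventions, which is why the derived time, hour, week, and weekday columns in the views read so naturally.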


gen_bigquery_reader_ops.py

Source: gen_bigquery_reader_ops.py (GitHub)


1"""Python wrappers around TensorFlow ops.2This file is MACHINE GENERATED! Do not edit.3Original C++ source file: gen_bigquery_reader_ops.cc4"""5import collections as _collections6import six as _six7from tensorflow.python import pywrap_tensorflow as _pywrap_tensorflow8from tensorflow.python.eager import context as _context9from tensorflow.python.eager import core as _core10from tensorflow.python.eager import execute as _execute11from tensorflow.python.framework import dtypes as _dtypes12from tensorflow.python.framework import errors as _errors13from tensorflow.python.framework import tensor_shape as _tensor_shape14from tensorflow.core.framework import op_def_pb2 as _op_def_pb215# Needed to trigger the call to _set_call_cpp_shape_fn.16from tensorflow.python.framework import common_shapes as _common_shapes17from tensorflow.python.framework import op_def_registry as _op_def_registry18from tensorflow.python.framework import ops as _ops19from tensorflow.python.framework import op_def_library as _op_def_library20from tensorflow.python.util.tf_export import tf_export21@tf_export('big_query_reader')22def big_query_reader(project_id, dataset_id, table_id, columns, timestamp_millis, container="", shared_name="", test_end_point="", name=None):23 r"""A Reader that outputs rows from a BigQuery table as tensorflow Examples.24 Args:25 project_id: A `string`. GCP project ID.26 dataset_id: A `string`. BigQuery Dataset ID.27 table_id: A `string`. Table to read.28 columns: A list of `strings`.29 List of columns to read. Leave empty to read all columns.30 timestamp_millis: An `int`.31 Table snapshot timestamp in millis since epoch. Relative32 (negative or zero) snapshot times are not allowed. For more details, see33 'Table Decorators' in BigQuery docs.34 container: An optional `string`. Defaults to `""`.35 If non-empty, this reader is placed in the given container.36 Otherwise, a default container is used.37 shared_name: An optional `string`. Defaults to `""`.38 If non-empty, this reader is named in the given bucket39 with this shared_name. Otherwise, the node name is used instead.40 test_end_point: An optional `string`. Defaults to `""`.41 Do not use. For testing purposes only.42 name: A name for the operation (optional).43 Returns:44 A `Tensor` of type mutable `string`. The handle to reference the Reader.45 """46 _ctx = _context._context47 if _ctx is None or not _ctx._eager_context.is_eager:48 project_id = _execute.make_str(project_id, "project_id")49 dataset_id = _execute.make_str(dataset_id, "dataset_id")50 table_id = _execute.make_str(table_id, "table_id")51 if not isinstance(columns, (list, tuple)):52 raise TypeError(53 "Expected list for 'columns' argument to "54 "'big_query_reader' Op, not %r." 
% columns)55 columns = [_execute.make_str(_s, "columns") for _s in columns]56 timestamp_millis = _execute.make_int(timestamp_millis, "timestamp_millis")57 if container is None:58 container = ""59 container = _execute.make_str(container, "container")60 if shared_name is None:61 shared_name = ""62 shared_name = _execute.make_str(shared_name, "shared_name")63 if test_end_point is None:64 test_end_point = ""65 test_end_point = _execute.make_str(test_end_point, "test_end_point")66 _, _, _op = _op_def_lib._apply_op_helper(67 "BigQueryReader", project_id=project_id, dataset_id=dataset_id,68 table_id=table_id, columns=columns, timestamp_millis=timestamp_millis,69 container=container, shared_name=shared_name,70 test_end_point=test_end_point, name=name)71 _result = _op.outputs[:]72 _inputs_flat = _op.inputs73 _attrs = ("container", _op.get_attr("container"), "shared_name",74 _op.get_attr("shared_name"), "project_id",75 _op.get_attr("project_id"), "dataset_id",76 _op.get_attr("dataset_id"), "table_id",77 _op.get_attr("table_id"), "columns", _op.get_attr("columns"),78 "timestamp_millis", _op.get_attr("timestamp_millis"),79 "test_end_point", _op.get_attr("test_end_point"))80 _execute.record_gradient(81 "BigQueryReader", _inputs_flat, _attrs, _result, name)82 _result, = _result83 return _result84 else:85 raise RuntimeError("big_query_reader op does not support eager execution. Arg 'reader_handle' is a ref.")86 raise RuntimeError("big_query_reader op does not support eager execution. Arg 'reader_handle' is a ref.")87@tf_export('generate_big_query_reader_partitions')88def generate_big_query_reader_partitions(project_id, dataset_id, table_id, columns, timestamp_millis, num_partitions, test_end_point="", name=None):89 r"""Generates serialized partition messages suitable for batch reads.90 This op should not be used directly by clients. Instead, the91 bigquery_reader_ops.py file defines a clean interface to the reader.92 Args:93 project_id: A `string`. GCP project ID.94 dataset_id: A `string`. BigQuery Dataset ID.95 table_id: A `string`. Table to read.96 columns: A list of `strings`.97 List of columns to read. Leave empty to read all columns.98 timestamp_millis: An `int`.99 Table snapshot timestamp in millis since epoch. Relative100 (negative or zero) snapshot times are not allowed. For more details, see101 'Table Decorators' in BigQuery docs.102 num_partitions: An `int`. Number of partitions to split the table into.103 test_end_point: An optional `string`. Defaults to `""`.104 Do not use. For testing purposes only.105 name: A name for the operation (optional).106 Returns:107 A `Tensor` of type `string`. Serialized table partitions.108 """109 _ctx = _context._context110 if _ctx is None or not _ctx._eager_context.is_eager:111 project_id = _execute.make_str(project_id, "project_id")112 dataset_id = _execute.make_str(dataset_id, "dataset_id")113 table_id = _execute.make_str(table_id, "table_id")114 if not isinstance(columns, (list, tuple)):115 raise TypeError(116 "Expected list for 'columns' argument to "117 "'generate_big_query_reader_partitions' Op, not %r." 
% columns)118 columns = [_execute.make_str(_s, "columns") for _s in columns]119 timestamp_millis = _execute.make_int(timestamp_millis, "timestamp_millis")120 num_partitions = _execute.make_int(num_partitions, "num_partitions")121 if test_end_point is None:122 test_end_point = ""123 test_end_point = _execute.make_str(test_end_point, "test_end_point")124 _, _, _op = _op_def_lib._apply_op_helper(125 "GenerateBigQueryReaderPartitions", project_id=project_id,126 dataset_id=dataset_id, table_id=table_id, columns=columns,127 timestamp_millis=timestamp_millis, num_partitions=num_partitions,128 test_end_point=test_end_point, name=name)129 _result = _op.outputs[:]130 _inputs_flat = _op.inputs131 _attrs = ("project_id", _op.get_attr("project_id"), "dataset_id",132 _op.get_attr("dataset_id"), "table_id",133 _op.get_attr("table_id"), "columns", _op.get_attr("columns"),134 "timestamp_millis", _op.get_attr("timestamp_millis"),135 "num_partitions", _op.get_attr("num_partitions"),136 "test_end_point", _op.get_attr("test_end_point"))137 _execute.record_gradient(138 "GenerateBigQueryReaderPartitions", _inputs_flat, _attrs, _result, name)139 _result, = _result140 return _result141 else:142 try:143 _result = _pywrap_tensorflow.TFE_Py_FastPathExecute(144 _ctx._context_handle, _ctx._eager_context.device_name,145 "GenerateBigQueryReaderPartitions", name,146 _ctx._post_execution_callbacks, "project_id", project_id,147 "dataset_id", dataset_id, "table_id", table_id, "columns", columns,148 "timestamp_millis", timestamp_millis, "num_partitions",149 num_partitions, "test_end_point", test_end_point)150 return _result151 except _core._FallbackException:152 return generate_big_query_reader_partitions_eager_fallback(153 project_id=project_id, dataset_id=dataset_id, table_id=table_id,154 columns=columns, timestamp_millis=timestamp_millis,155 num_partitions=num_partitions, test_end_point=test_end_point,156 name=name, ctx=_ctx)157 except _core._NotOkStatusException as e:158 if name is not None:159 message = e.message + " name: " + name160 else:161 message = e.message162 _six.raise_from(_core._status_to_exception(e.code, message), None)163def generate_big_query_reader_partitions_eager_fallback(project_id, dataset_id, table_id, columns, timestamp_millis, num_partitions, test_end_point="", name=None, ctx=None):164 r"""This is the slowpath function for Eager mode.165 This is for function generate_big_query_reader_partitions166 """167 _ctx = ctx if ctx else _context.context()168 project_id = _execute.make_str(project_id, "project_id")169 dataset_id = _execute.make_str(dataset_id, "dataset_id")170 table_id = _execute.make_str(table_id, "table_id")171 if not isinstance(columns, (list, tuple)):172 raise TypeError(173 "Expected list for 'columns' argument to "174 "'generate_big_query_reader_partitions' Op, not %r." 
% columns)175 columns = [_execute.make_str(_s, "columns") for _s in columns]176 timestamp_millis = _execute.make_int(timestamp_millis, "timestamp_millis")177 num_partitions = _execute.make_int(num_partitions, "num_partitions")178 if test_end_point is None:179 test_end_point = ""180 test_end_point = _execute.make_str(test_end_point, "test_end_point")181 _inputs_flat = []182 _attrs = ("project_id", project_id, "dataset_id", dataset_id, "table_id",183 table_id, "columns", columns, "timestamp_millis", timestamp_millis,184 "num_partitions", num_partitions, "test_end_point", test_end_point)185 _result = _execute.execute(b"GenerateBigQueryReaderPartitions", 1,186 inputs=_inputs_flat, attrs=_attrs, ctx=_ctx,187 name=name)188 _execute.record_gradient(189 "GenerateBigQueryReaderPartitions", _inputs_flat, _attrs, _result, name)190 _result, = _result191 return _result192def _InitOpDefLibrary(op_list_proto_bytes):193 op_list = _op_def_pb2.OpList()194 op_list.ParseFromString(op_list_proto_bytes)195 _op_def_registry.register_op_list(op_list)196 op_def_lib = _op_def_library.OpDefLibrary()197 op_def_lib.add_op_list(op_list)198 return op_def_lib199# op {200# name: "BigQueryReader"201# output_arg {202# name: "reader_handle"203# type: DT_STRING204# is_ref: true205# }206# attr {207# name: "container"208# type: "string"209# default_value {210# s: ""211# }212# }213# attr {214# name: "shared_name"215# type: "string"216# default_value {217# s: ""218# }219# }220# attr {221# name: "project_id"222# type: "string"223# }224# attr {225# name: "dataset_id"226# type: "string"227# }228# attr {229# name: "table_id"230# type: "string"231# }232# attr {233# name: "columns"234# type: "list(string)"235# }236# attr {237# name: "timestamp_millis"238# type: "int"239# }240# attr {241# name: "test_end_point"242# type: "string"243# default_value {244# s: ""245# }246# }247# is_stateful: true248# }249# op {250# name: "GenerateBigQueryReaderPartitions"251# output_arg {252# name: "partitions"253# type: DT_STRING254# }255# attr {256# name: "project_id"257# type: "string"258# }259# attr {260# name: "dataset_id"261# type: "string"262# }263# attr {264# name: "table_id"265# type: "string"266# }267# attr {268# name: "columns"269# type: "list(string)"270# }271# attr {272# name: "timestamp_millis"273# type: "int"274# }275# attr {276# name: "num_partitions"277# type: "int"278# }279# attr {280# name: "test_end_point"281# type: "string"282# default_value {283# s: ""284# }285# }286# }...
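In this generated wrapper, timestamp_millis is a plain integer op attribute: an absolute table-snapshot time in milliseconds since the Unix epoch (relative or zero snapshot times are rejected, as the docstring notes). Below is a hedged usage sketch that computes a snapshot timestamp for "now" and passes it to big_query_reader; it assumes TensorFlow 1.x graph mode, the tf.contrib import path where this generated module typically lived, and placeholder project, dataset, and table names.

# Hedged usage sketch: build a BigQueryReader handle pinned to a table snapshot
# taken "now". Assumes TF 1.x graph mode; the import path and the resource names
# below are assumptions/placeholders, not part of the generated file above.
import time

from tensorflow.contrib.cloud.python.ops import gen_bigquery_reader_ops

snapshot_millis = int(time.time() * 1000)  # absolute millis since epoch; must be positive

reader_handle = gen_bigquery_reader_ops.big_query_reader(
    project_id="my-gcp-project",   # placeholder
    dataset_id="my_dataset",       # placeholder
    table_id="my_table",           # placeholder
    columns=[],                    # empty list means "read all columns"
    timestamp_millis=snapshot_millis)

Pinning the reader to a fixed snapshot timestamp is what makes partitioned, parallel reads consistent: every partition generated by generate_big_query_reader_partitions sees the table as of the same instant.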


Automation Testing Tutorials

Learn to execute automation testing from scratch with the LambdaTest Learning Hub, from setting up the prerequisites and running your first automation test to following best practices and diving into advanced test scenarios. The LambdaTest Learning Hub compiles step-by-step guides to help you become proficient with different test automation frameworks, e.g. Selenium, Cypress, and TestNG.

LambdaTest Learning Hubs:

YouTube

You can also refer to the video tutorials on the LambdaTest YouTube channel for step-by-step demonstrations from industry experts.

Run LocalStack automation tests on the LambdaTest cloud grid

Perform automation testing on 3000+ real desktop and mobile devices online.
