10th September 2017 at 12:53 pm #339
I am trying to connect to Google BigQuery (I have just started with it). In the settings.properties file I see an example for BigQuery as follows:
Assuming that ‘rosy-ratio-132710’ is the project you are pointing to, what is the 100? And how can you add a username/password or a JSON key file?
Thanks for any answer.
Jeroen
11th September 2017 at 12:29 pm #341
Thank you for your query. Yes, you are correct: you would add the connection properties in the settings.properties file. Below is the explanation:
Project Name: rosy-ratio-132710
Wait Time: 100 (in milliseconds)
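For illustration, a BigQuery entry in settings.properties would look something like the sketch below; the property names here are hypothetical, so check the sample file shipped with your installation for the exact spelling:

```properties
# Hypothetical sketch -- exact property names depend on your AIS version
bigquery.project=rosy-ratio-132710
# How long to wait between polls for query results, in milliseconds
bigquery.waitTime=100
```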
Usually, properties such as keys and the project ID are stored in the JSON file that is downloaded when you create a key. To create a key for your project, follow these steps:
Log on to your console:
Go to the Service Accounts tab in the left menu bar
While in the Service Accounts tab, click on the ellipsis on the right-hand side to expand the options; you will find the option to create a key
Once the key is generated, it will automatically download the JSON file with all the properties.
You are also required to create a system environment variable with the following name:
and assign the file location (of the JSON file you downloaded) as its value.
Once the above steps have been followed, you can now update your settings.properties file.
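For reference, the environment variable that Google's client libraries (and tools built on them) look for is GOOGLE_APPLICATION_CREDENTIALS; the path below is a placeholder for wherever you saved the downloaded key:

```shell
# GOOGLE_APPLICATION_CREDENTIALS is Google's standard variable name;
# replace the path with the actual location of your downloaded JSON key.
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/rosy-ratio-132710-key.json"
echo "$GOOGLE_APPLICATION_CREDENTIALS"
```

On Windows you would set it as a system environment variable via System Properties instead.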
I hope the above helps you connect to Google BigQuery. You can get step-by-step help for Google BigQuery from the following link:
BR, Debbie
20th September 2017 at 2:16 pm #344
A very late followup.
I succeeded in making a connection to Google BigQuery as you described. When I create a query on a table with 12 columns and about 40k records, I see the correct column names in the Output Columns tab of the dataset. So the connection is there.
But the data is not getting there! When I add a LIMIT 100 to the query and try to preview it, it shows the loading… text forever (or at least for the 15 minutes I tried). In another tool we are evaluating, the same query takes 2 seconds.
I tried looking in the logs, but there was nothing to see.
In the Apache logs directory there is a localhost_access_log for that date which shows one entry at the time I was running the preview.
0:0:0:0:0:0:0:1 – – [20/Sep/2017:15:11:08 +0200] “POST /ais/rest/dataset/preview HTTP/1.1” 500 5213
Maybe the 500 or the 5213 means something.
Any help is appreciated.
Jeroen
22nd September 2017 at 12:01 pm #345
Thank you for your query; we are really sorry that you are facing an issue when connecting to Google BigQuery. What I can figure out from error 500 is that it isn’t getting any data, which could be a connection issue; however, since you can pull data from another tool/system, I do not see this as a connection issue.
We have tried with various datasets of 50k+ records and all seem to work fine. I would suggest that you take our free live technical support, which will enable our technical team to provide you with live remote assistance and help you overcome the problem you are currently facing extracting the data. This will enable our technical team to run some troubleshooting and find the point of failure.
Kindly drop email at email@example.com with your availability and they will arrange everything for you.
Joe Dallesandro
23rd September 2017 at 8:15 pm #346
I experimented a little further. In BigQuery I have a json field (type string) which contains a JSON string.
I created a view in which I select the fields from the JSON that I want to use, using the following SQL:
json_extract_scalar(json, '$.category.name') as categoryName
All fields work until this one; then it says “no records to show”.
I also see some timestamp fields in the generated view that stay empty (although the other data is shown). I generated them using:
timestamp(substr(json_extract_scalar(json, '$.callDate'), 0, 22)) as callDate,
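As a side note for anyone debugging the same thing: the extraction the two SQL snippets above perform can be mimicked locally in Python, which is a quick way to check whether the JSON path and the substring bounds actually match your data (the sample payload below is made up):

```python
import json
from datetime import datetime

# Made-up sample payload mimicking the string column named 'json' in BigQuery
row = '{"category": {"name": "support"}, "callDate": "2017-09-23T20:15:00.000+0200"}'
doc = json.loads(row)

# Rough equivalent of json_extract_scalar(json, '$.category.name')
category_name = doc["category"]["name"]

# Rough equivalent of timestamp(substr(json_extract_scalar(json, '$.callDate'), 0, 22)):
# keep the first 22 characters (dropping the timezone offset) and parse them.
call_date = datetime.strptime(doc["callDate"][:22], "%Y-%m-%dT%H:%M:%S.%f")

print(category_name)  # support
print(call_date)      # 2017-09-23 20:15:00
```

If the path is wrong or the substring cuts the value mid-field, the failure shows up here immediately instead of as an empty column in the view.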
I will investigate further.
Jeroen
25th September 2017 at 2:02 pm #347
Could you please share some sample data, so that I can try it at my end?
10th April 2018 at 7:20 am #404
- This reply was modified 1 year, 6 months ago by joe_dallesandro.
Hello, finally a follow-up on this…
I saw the new AIS 2.1 on your site and installed it. I added the BigQuery data source from the GUI (thanks for that) and tried again to get my data out of BigQuery.
But then I realized I was querying a view that was created with BigQuery’s standard SQL, while the default dialect AIS expects (as does BigQuery itself) is legacy SQL.
So by adding #standardSQL before my actual query, I get the results I expected. My dates are also shown in the correct way. So AIS was doing it right all along :-).
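For other readers hitting the same thing: the #standardSQL prefix goes on the very first line of the query, for example (the project/dataset/view names below are placeholders):

```sql
#standardSQL
SELECT
  json_extract_scalar(json, '$.category.name') AS categoryName
FROM `my-project.my_dataset.my_view`
LIMIT 100
```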
But I have found another issue I would like to present to you.
When I run a query in the BigQuery Query Editor (in the browser), using this query on the Public Datasets > nyc-tlc:green dataset:
select pickup_datetime from
Then the timestamp fields are shown as:
2014-03-27 10:40:00.000 UTC
2014-01-08 06:45:20.000 UTC
When I copy the same query to AIS and run it, the field is also recognized as a timestamp field, but the results are shown like this:
Changing the query to:
select datetime(pickup_datetime) from
fixes this, but it would be nice if timestamp fields returned a human-readable format, like the query does when run in BigQuery itself.
I tend to write the queries in the BQ Query Editor (colour-coded, auto-complete, etc.) and copy them to my visualization tool of choice.
If this is the desired behaviour I respect that, but as it is known that the field is of type timestamp, it seems strange to me to present it as a float.
Jeroen
11th April 2018 at 12:52 pm #405
You can describe TIMESTAMP data types as either UNIX timestamps or calendar datetimes. BigQuery stores TIMESTAMP data internally as a UNIX timestamp with microsecond precision. Check here for more information: https://cloud.google.com/bigquery/data-types
Currently AIS supports legacy SQL queries; standard SQL support should be added shortly. If this is blocking you, just send an email to firstname.lastname@example.org and they should provide you with a patch, or you can use the workaround like you did, or format the timestamp using the FORMAT_UTC_USEC function.
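In other words, what AIS shows is the raw epoch value. Converting such a value to the calendar form the BigQuery editor displays is a one-liner; the sketch below assumes the value is surfaced in seconds (the sample number is chosen to match the 2014-03-27 10:40:00 UTC row above; if your driver hands back microseconds, divide by 1,000,000 first):

```python
from datetime import datetime, timezone

# A TIMESTAMP surfaced as a raw UNIX epoch value in seconds (assumed unit);
# 1395916800.0 corresponds to the 2014-03-27 10:40:00 UTC row from the taxi data.
epoch_seconds = 1395916800.0

ts = datetime.fromtimestamp(epoch_seconds, tz=timezone.utc)
print(ts.strftime("%Y-%m-%d %H:%M:%S %Z"))  # 2014-03-27 10:40:00 UTC
```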