Firestore to BigQuery export
An automatic tool for copying and converting Cloud Firestore data to BigQuery.
Firestore is awesome. BigQuery is awesome. But transferring data from Firestore to BigQuery sucks. This package lets you plug and play your way out of config hell.
- Create a BigQuery dataset with tables corresponding to your Firestore collections.
- Table schemas are automatically generated based on your document property data types.
- Convert and copy your Firestore collections to BigQuery.
This package doesn't write anything to Firestore.
Installation
npm i firestore-to-bigquery-export
Then
import bigExport from 'firestore-to-bigquery-export'
Or
const bigExport = require('firestore-to-bigquery-export')
Then
const GCPSA = require('./Your-Service-Account-File.json')
bigExport.setBigQueryConfig(GCPSA)
bigExport.setFirebaseConfig(GCPSA)

How to
API
- bigExport.setBigQueryConfig(serviceAccountFile: JSON)
- bigExport.setFirebaseConfig(serviceAccountFile: JSON)
- bigExport.createBigQueryTables(datasetID: string, collectionNames: Array): Promise<number>
- bigExport.copyCollectionsToBigQuery(datasetID: string, collectionNames: Array): Promise<number>
- bigExport.deleteBigQueryTables(datasetID: string, tableNames: Array): Promise<number>
Examples
/* Initialize a BigQuery dataset named 'firestore' with four tables.
* Table names equal collection names from Firestore.
* Table schemas will be autogenerated based on the documents in the collections.
*/
bigExport.createBigQueryTables('firestore', [
'payments',
'profiles',
'ratings',
'users'
])
.then(res => {
console.log(res)
})
.catch(error => {
console.error(error)
})

Then, you can transport your data:
/* Copy and convert all documents in the given collections.
 * Each document is inserted as a row in the table named after its collection, in the dataset named 'firestore'.
 * Cells (document properties) that don't match the table schema will be rejected.
*/
bigExport.copyCollectionsToBigQuery('firestore', [
'payments',
'profiles',
'ratings',
'users'
])
.then(res => {
console.log(res)
})
.catch(error => {
console.error(error)
})

After that, you may want to refresh your data. For the time being, the quick-and-dirty way is to delete your tables and make new ones:
// Deleting the given BigQuery tables.
bigExport.deleteBigQueryTables('firestore', [
'payments',
'profiles',
'ratings',
'users'
])
.then(res => {
console.log(res)
})
.catch(error => {
console.error(error)
})

Limitations
- Your Firestore data model should be consistent. If the same property has different data types across documents in one collection, you'll get errors.
- Patching existing BigQuery tables isn't supported (yet). To refresh your datasets, you can deleteBigQueryTables(), then createBigQueryTables(), and then copyCollectionsToBigQuery(), as sketched below.
- Changed your Firestore data model? Delete the corresponding BigQuery table and run createBigQueryTables() to create a table with the new schema.
- When running this package via a Cloud Function, your function may time out if your Firestore is large. You can then:
  - Increase the timeout for your Cloud Function in the Google Cloud Platform Cloud Function Console.
  - Run your function locally, using firebase serve --only functions.
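To refresh in one go, the three calls can be chained. A minimal sketch, reusing the dataset and collection names from the examples above:

// Quick-and-dirty refresh: drop the old tables, recreate them with
// fresh schemas, then copy the current Firestore data over.
const collections = ['payments', 'profiles', 'ratings', 'users']
bigExport.deleteBigQueryTables('firestore', collections)
  .then(() => bigExport.createBigQueryTables('firestore', collections))
  .then(() => bigExport.copyCollectionsToBigQuery('firestore', collections))
  .then(res => {
    console.log(res)
  })
  .catch(error => {
    console.error(error)
  })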
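If you do run the export from a Cloud Function, a scheduled function might look roughly like the sketch below. It assumes the firebase-functions v1 Pub/Sub scheduler; nightlyExport, the cron schedule, and the runWith() limits are illustrative placeholders, not part of this package:

const functions = require('firebase-functions')
const bigExport = require('firestore-to-bigquery-export')
const GCPSA = require('./Your-Service-Account-File.json')

bigExport.setBigQueryConfig(GCPSA)
bigExport.setFirebaseConfig(GCPSA)

// Hypothetical nightly export at 03:00, with a raised timeout and more
// memory for large Firestore databases.
exports.nightlyExport = functions
  .runWith({ timeoutSeconds: 540, memory: '1GB' })
  .pubsub.schedule('0 3 * * *')
  .onRun(() => {
    return bigExport.copyCollectionsToBigQuery('firestore', ['payments', 'users'])
  })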
Issues
Please use the issue tracker.