Export and Import Database
Export / Import IndexedDB <=> Blob
The npm package dexie-export-import extends Dexie with the capability to export and import databases to and from Blobs.
Install
npm install dexie
npm install dexie-export-import
Usage
Here’s the basic usage. There’s a lot you can do by supplying the optional [options]
arguments. The available options are described later in this README (see the Typescript interfaces below).
import Dexie from "dexie";
import {importDB, exportDB, importInto, peakImportFile} from "dexie-export-import";
//
// --- importDB() ---
//
// Import from Blob or File to Dexie instance:
//
const db = await importDB(blob, [options]);
//
// --- exportDB() ---
//
// Export to Blob
//
const blob = await exportDB(db, [options]);
//
// --- importInto() ---
//
// Import from Blob or File to existing Dexie instance
//
await importInto(db, blob, [options]);
//
// --- peakImportFile() ---
//
// If you need to peek the metadata from the import file without actually
// performing any import operation
// (since v1.0.0)
//
const importMeta = await peakImportFile(blob);
assert.areEqual(importMeta.formatName, "dexie");
assert.isTrue(importMeta.formatVersion === 1);
console.log("Database name:", importMeta.data.databaseName);
console.log("Database version:", importMeta.data.databaseVersion);
console.log("Tables:", importMeta.data.tables.map(t =>
`${t.name} (${t.rowCount} rows)`
).join('\n\t'));
Note that you can also import the package as follows (NOTE: Typescript users on dexie@2.x will get compilation errors if using the static import method Dexie.import()):
import Dexie from "dexie";
import "dexie-export-import";
//
// --- Dexie.import() ---
//
// Import from Blob or File to Dexie instance:
//
const db = await Dexie.import(blob, [options]); // equivalent to importDB()
//
// --- db.export() ---
//
// Export to Blob
//
const blob = await db.export([options]); // equivalent to exportDB()
//
// --- db.import() ---
//
// Import from Blob or File to existing Dexie instance
//
await db.import(blob, [options]); // equivalent to importInto()
Sample
Here’s a working sample on CodePen. It uses downloadjs to deliver the blob as a “file download” to the user. For receiving an import file, it uses a drop area where you can drop your JSON file. Click the Console tab at the bottom to see what the progressCallbacks receive.
Even though this sample doesn’t show it, blobs can also be sent or retrieved to/from a server, using the fetch API.
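As a sketch of that round trip (the endpoint URL and form-field name below are invented for illustration and not part of any real API):

```javascript
// Hypothetical round trip: upload an export Blob to a server and fetch
// one back. The URL and "backup" field name are made up for this sketch.
async function uploadExport(blob) {
  const form = new FormData();
  form.append("backup", blob, "export.json");
  await fetch("https://example.com/backups", { method: "POST", body: form });
}

async function downloadExport() {
  const response = await fetch("https://example.com/backups/latest");
  return response.blob(); // pass this Blob on to importDB()/importInto()
}
```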
Here’s a blog article on how to export IndexedDB from DevTools on an arbitrary web page or web app, by dynamically including dexie and dexie-export-import in the devtools console.
Features
- Export of an IndexedDB database to a JSON Blob.
- Import from a Blob back to an IndexedDB database.
- An import Blob can be retrieved from a URL (using fetch()) or from a user-supplied file (dropped or browsed to).
- An export Blob can either be given to the end user to store among downloaded files, or be sent to a server over HTTP(S) using fetch().
- Chunk-wise / streaming - does not read the entire database into RAM.
- Progress callback (typically for showing a progress bar).
- Optional filter allows importing/exporting a subset of the data.
- Optional transform allows altering or migrating the data being imported.
- Support for all structured-cloneable exotic types (Date, ArrayBuffer, Blob, etc.) except CryptoKeys (which by design cannot be exported).
- Atomic - import/export within one database transaction (optional).
- Export speed: uses getAll() in chunks rather than openCursor().
- Import speed: uses bulkPut() in chunks rather than put().
- Can be run from a Web Worker (better speed + doesn’t lock the GUI).
- Can also export IndexedDB databases that were not created with Dexie.
Compatibility
Product | Supported versions
--- | ---
dexie | ^2.0.4, ^3.x, ^4.x
Safari | ^10.1
IE | 11
Edge | any version
Chrome | any version
FF | any version
Similar Libraries
indexeddb-export-import
Much smaller in size, but also much more limited than dexie-export-import.
indexeddb-export-import can be a better choice if:
- your data contains no Dates, ArrayBuffers, TypedArrays or Blobs (only objects, strings, numbers, booleans and arrays).
- your database is small enough to fit in RAM on your target devices.
Dexie-export-import was built to scale when exporting large databases without consuming much RAM. It also supports importing/exporting exotic types.
Interface
Importing this module extends Dexie and Dexie.prototype as follows. Even though this is conceptually a Dexie.js addon, there is no addon instance; the extended interface is applied to Dexie and Dexie.prototype as a side effect of importing the module.
//
// Extend Dexie interface (typescript-wise)
//
declare module 'dexie' {
// Extend methods on db
interface Dexie {
export(options?: ExportOptions): Promise<Blob>;
import(blob: Blob, options?: ImportOptions): Promise<void>;
}
interface DexieConstructor {
import(blob: Blob, options?: StaticImportOptions): Promise<Dexie>;
}
}
StaticImportOptions and ImportOptions
These are the interfaces of the optional options arguments to Dexie.import() and Dexie.prototype.import(). All options are optional and default to undefined (falsy).
export interface StaticImportOptions {
noTransaction?: boolean;
chunkSizeBytes?: number; // Default: DEFAULT_KILOBYTES_PER_CHUNK ( 1MB )
filter?: (table: string, value: any, key?: any) => boolean;
transform?: (table: string, value: any, key?: any) => ({value: any, key?: any});
progressCallback?: (progress: ImportProgress) => boolean;
}
export interface ImportOptions extends StaticImportOptions {
acceptMissingTables?: boolean;
acceptVersionDiff?: boolean;
acceptNameDiff?: boolean;
acceptChangedPrimaryKey?: boolean;
overwriteValues?: boolean;
clearTablesBeforeImport?: boolean;
noTransaction?: boolean;
chunkSizeBytes?: number; // Default: DEFAULT_KILOBYTES_PER_CHUNK ( 1MB )
filter?: (table: string, value: any, key?: any) => boolean;
transform?: (table: string, value: any, key?: any) => ({value: any, key?: any});
progressCallback?: (progress: ImportProgress) => boolean;
}
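For illustration, here is a sketch of an options object combining a few of these settings. The "friends" table name and the imported flag are hypothetical, and the semantics of the progressCallback return value are described by the interface above, not invented here:

```javascript
// Hypothetical ImportOptions object: clear tables first, only import rows
// from the "friends" table, tag each imported value, and log progress.
const importOptions = {
  clearTablesBeforeImport: true,
  filter: (table, value, key) => table === "friends",
  transform: (table, value, key) => ({ value: { ...value, imported: true }, key }),
  progressCallback: ({ completedRows, totalRows }) => {
    console.log(`Imported ${completedRows} of ${totalRows} rows`);
    return true;
  },
};
```

Such an object would be passed as the options argument, e.g. `db.import(blob, importOptions)`.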
ImportProgress
This is the interface sent to the progressCallback.
export interface ImportProgress {
totalTables: number;
completedTables: number;
totalRows: number;
completedRows: number;
done: boolean;
}
ExportOptions
This is the interface of the optional options argument to Dexie.prototype.export(). All options are optional and default to undefined (falsy).
export interface ExportOptions {
noTransaction?: boolean;
numRowsPerChunk?: number;
prettyJson?: boolean;
filter?: (table: string, value: any, key?: any) => boolean;
progressCallback?: (progress: ExportProgress) => boolean;
}
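A corresponding sketch for export (the "logs" table name is hypothetical):

```javascript
// Hypothetical ExportOptions object: pretty-print the JSON, export 500 rows
// per chunk, and skip the "logs" table.
const exportOptions = {
  prettyJson: true,
  numRowsPerChunk: 500,
  filter: (table, value, key) => table !== "logs",
  progressCallback: ({ completedRows, totalRows }) => {
    console.log(`Exported ${completedRows} of ${totalRows} rows`);
    return true;
  },
};
```

Such an object would be passed as `db.export(exportOptions)`.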
ExportProgress
This is the interface sent to the ExportOptions.progressCallback.
export interface ExportProgress {
totalTables: number;
completedTables: number;
totalRows: number;
completedRows: number;
done: boolean;
}
Defaults
These are the default chunk sizes used when not specified in the options object. We allow quite large chunks, but still not that large (1MB RAM is not much even for a small device).
const DEFAULT_KILOBYTES_PER_CHUNK = 1024; // When importing blob
const DEFAULT_ROWS_PER_CHUNK = 2000; // When exporting db
JSON Format
The JSON format is described in the Typescript interface below. This JSON format is streamable: it is generated in a streaming fashion, and also imported in a streaming fashion. It is therefore important that the data comes last in the file.
export interface DexieExportJsonStructure {
formatName: 'dexie';
formatVersion: 1;
data: {
databaseName: string;
databaseVersion: number;
tables: Array<{
name: string;
schema: string; // '++id,name,age'
rowCount: number;
}>;
data: Array<{ // This property must be last (for streaming purpose)
tableName: string;
inbound: boolean;
rows: any[]; // This property must be last (for streaming purpose)
}>;
}
}
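Because the metadata precedes the potentially huge row data, a reader can inspect the header of an export without loading the whole file. The following is a minimal sketch of that idea, not the library's actual parser (which streams tokens with clarinet); it assumes the metadata fits in the first chunk:

```javascript
// Sketch: read only the first few KB of an export blob and extract the
// database name, without parsing the (potentially huge) "data" arrays.
async function peekDatabaseName(blob, chunkSize = 4096) {
  const head = await blob.slice(0, chunkSize).text();
  const match = head.match(/"databaseName"\s*:\s*"([^"]*)"/);
  return match ? match[1] : null;
}

// Usage with a small, export-shaped blob:
const sampleExport = new Blob([JSON.stringify({
  formatName: "dexie",
  formatVersion: 1,
  data: { databaseName: "mydb", databaseVersion: 1, tables: [], data: [] },
})]);
peekDatabaseName(sampleExport).then((name) => console.log(name)); // logs "mydb"
```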
Example JSON File
{
"formatName": "dexie",
"formatVersion": 1,
"data": {
"databaseName": "dexie-export-import-basic-tests",
"databaseVersion": 1,
"tables": [
{
"name": "outbound",
"schema": "",
"rowCount": 2
},
{
"name": "inbound",
"schema": "++id",
"rowCount": 3
}
],
"data": [{
"tableName": "outbound",
"inbound": false,
"rows": [
[
1,
{
"foo": "bar"
}
],
[
2,
{
"bar": "foo"
}
]
]
},{
"tableName": "inbound",
"inbound": true,
"rows": [
{
"id": 1,
"date": 1,
"fullBlob": {
"type": "",
"data": "AAECAwQFBgcICQoLDA0ODxAREhMUFRYXGBkaGxwdHh8gISIjJCUmJygpKissLS4vMDEyMzQ1Njc4OTo7PD0+P0BBQkNERUZHSElKS0xNTk9QUVJTVFVWV1hZWltcXV5fYGFiY2RlZmdoaWprbG1ub3BxcnN0dXZ3eHl6e3x9fn+AgYKDhIWGh4iJiouMjY6PkJGSk5SVlpeYmZqbnJ2en6ChoqOkpaanqKmqq6ytrq+wsbKztLW2t7i5uru8vb6/wMHCw8TFxsfIycrLzM3Oz9DR0tPU1dbX2Nna29zd3t/g4eLj5OXm5+jp6uvs7e7v8PHy8/T19vf4+fr7/P3+/w=="
},
"binary": {
"buffer": "AQID",
"byteOffset": 0,
"length": 3
},
"text": "foo",
"bool": false,
"$types": {
"date": "date",
"fullBlob": "blob2",
"binary": "uint8array2",
"binary.buffer": "arraybuffer"
}
},
{
"id": 2,
"foo": "bar"
},
{
"id": 3,
"bar": "foo"
}
]
}]
}
Exporting IndexedDB Databases that weren’t generated with Dexie
As Dexie can dynamically open non-Dexie IndexedDB databases, this is not an issue. A sample is provided here:
import Dexie from 'dexie';
import {importDB, exportDB} from "dexie-export-import";
async function exportDatabase(databaseName) {
// Open an arbitrary IndexedDB database:
const db = await new Dexie(databaseName).open();
// Export it
const blob = await exportDB(db);
return blob;
}
async function importDatabase(file) {
// Import a file into a Dexie instance:
const db = await importDB(file);
return db.backendDB(); // backendDB() gives you the native IDBDatabase object.
}
Background / Why
This feature has been asked for a lot:
- https://github.com/dexie/Dexie.js/issues/391
- https://github.com/dexie/Dexie.js/issues/99
- https://stackoverflow.com/questions/46025699/dumping-indexeddb-data
My simple answer initially was this:
function exportAll(db) { // renamed: "export" is a reserved word
  return db.transaction('r', db.tables, () => {
    return Promise.all(
      db.tables.map(table => table.toArray()
        .then(rows => ({table: table.name, rows: rows}))));
  });
}
function importAll(data, db) { // renamed: "import" is likewise reserved
  return db.transaction('rw', db.tables, () => {
    return Promise.all(data.map(t =>
      db.table(t.table).clear()
        .then(() => db.table(t.table).bulkAdd(t.rows))));
  });
}
Looks simple!
But:
- The whole database has to fit in RAM. This can be an issue on small devices.
- If using JSON.stringify() / JSON.parse() on the data, we won’t support exotic types (Dates, Blobs, ArrayBuffers, etc.).
- It is not possible to show progress while importing.
This addon solves these issues, and some more, with the help of some libraries.
Libraries Used
To accomplish a streamable export/import, and to allow exotic types, I use the libraries listed below. Note that these libraries are listed as devDependencies because they are bundled using rollupjs, so there is no real dependency from the library user’s perspective.
typeson and typeson-registry
These modules enable something similar to JSON.stringify() / JSON.parse() for exotic or custom types.
clarinet
This module allows reading JSON in a streaming fashion.
Streaming JSON
I must admit that I had to do some research before I understood how to accomplish streaming JSON from client-side Javascript (both reading / writing). It is really not obvious that this would even be possible. Looking at the Blob interface, it does not provide any way of either reading or writing in a streamable fashion.
What I found though (after some googling) was that it is indeed possible to do that based on the current DOM platform (including IE11 !).
Reading JSON in Chunks
A File or Blob represents data that can reside in a disk file without yet being loaded into RAM. So how do we read the first 100 bytes from a Blob without reading all of it?
const firstPart = blob.slice(0,100);
Ok, and in the next step we use a FileReader to really read this sliced Blob into memory.
const first100Chars = await readBlob(firstPart);
function readBlob(blob: Blob): Promise<string> {
return new Promise((resolve, reject) => {
const reader = new FileReader();
reader.onabort = ev => reject(new Error("file read aborted"));
reader.onerror = ev => reject((ev.target as any).error);
reader.onload = ev => resolve((ev.target as any).result);
reader.readAsText(blob);
});
}
Voila!
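In modern environments the same chunked read can be done without FileReader at all, since Blob nowadays exposes a promise-based text() method (a newer API than what this IE11-compatible library targets):

```javascript
// Modern equivalent of slice + FileReader: Blob.prototype.text() returns
// a Promise<string>, so a sliced chunk can be read with one call.
async function readFirstChars(blob, count = 100) {
  return blob.slice(0, count).text();
}

const demoBlob = new Blob(["hello, streamed world"]);
readFirstChars(demoBlob, 5).then((s) => console.log(s)); // logs "hello"
```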
But! How can we keep transactions alive when calling this non-indexedDB async call?
I use two different solutions for this:
- If we are in a Worker, I use new FileReaderSync() instead of new FileReader().
- If we are in the main thread, I use Dexie.waitFor() while reading this short-lived chunk, keeping the transaction alive.
Ok, fine, but how do we parse the chunk then? We cannot use JSON.parse() on it, since the chunk will most definitely contain incomplete JSON.
Clarinet to the rescue. This library can read JSON and invoke callbacks as JSON tokens come in.
Writing JSON in Chunks
Writing JSON is solved more easily. As the BlobBuilder interface was deprecated from the DOM, I first thought this task impossible. But after digging around, I found that it SHOULD be possible if browsers implement the Blob interface correctly.
Blobs can be constructed from an array of other Blobs. This is the key.
- Let’s say we generate 1000 Blobs of 1 MB each on a device with 512 MB RAM. If the browser does its job well, it will allow the first 200 blobs or so to reside in RAM, but then it should start putting the remaining blobs into temporary files.
- We put all these 1000 blobs into an array and generate a final Blob from that array.
And that’s pretty much it.
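The chunked-write idea can be sketched in a few lines: each chunk of JSON becomes its own small Blob, and the final Blob is assembled from the array of chunk Blobs, so the full JSON text never has to be concatenated in memory:

```javascript
// Sketch of chunk-wise JSON writing: build many small Blobs, then combine
// them into one final Blob (which browsers may back with temporary files).
const chunks = [];
for (let i = 0; i < 3; i++) {
  chunks.push(new Blob([JSON.stringify({ chunk: i }) + "\n"]));
}
const finalBlob = new Blob(chunks, { type: "application/json" });
finalBlob.text().then((text) => console.log(text.trim().split("\n").length)); // logs 3
```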