Download the PHP package keboola/db-import-export without Composer

On this page you can find all versions of the PHP package keboola/db-import-export. You can download/install these versions without Composer; possible dependencies are resolved automatically.

FAQ

After the download, you only have to add one include, require_once('vendor/autoload.php');. After that, you can import the classes with use statements.

Example:
If you use only one package, a project is not needed. But if you use more than one package, it is not possible to import the classes with use statements without a project.
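For illustration, a script that uses a downloaded package could look like the sketch below; Acme\Demo\Greeter and its hello() method are hypothetical placeholders for whatever classes your package actually ships.

    <?php

    // One include loads the Composer autoloader for every downloaded package.
    require_once('vendor/autoload.php');

    // Import the classes you need with use statements.
    // Acme\Demo\Greeter is a hypothetical placeholder class.
    use Acme\Demo\Greeter;

    $greeter = new Greeter();
    echo $greeter->hello('world');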

In general, it is recommended to always use a project to download your libraries; an application normally needs more than one library.
Some PHP packages are not free to download and are therefore hosted in private repositories. In that case, credentials are needed to access such packages. If a package comes from a private repository, please insert the credentials in the auth.json textarea. You can look here for more information.

  • Some hosting environments are not accessible via terminal or SSH, so Composer cannot be used there.
  • Using Composer is sometimes complicated, especially for beginners.
  • Composer needs a lot of resources, which are sometimes not available on a simple webspace.
  • If you are using private repositories, you don't need to share your credentials. You can set up everything on our site and then provide a simple download link to your team members.
  • Simplify your Composer build process: use our command line tool to download the vendor folder as a binary. This makes your build process faster, and you don't need to expose your credentials for private repositories.

Information about the package db-import-export

DB Import export library

Supported operations

Features

Import

Export

Development

Docker

Prepare .env (a copy of .env.dist) and set up AWS keys that have access to the keboola-drivers bucket in order to build this image. Also add this user to the group ci-php-import-export-lib, which will allow you to work with the newly created bucket for tests.

The user can be created in Dev - Main legacy, where the groups for keboola-drivers and ci-php-import-export-lib are also located.

If you don't have access to keboola-drivers, you have to change the Dockerfile.

Then run docker compose build.

The AWS credentials also have to have access to the bucket specified in AWS_S3_BUCKET. This bucket has to contain testing data. Run docker compose run --rm dev composer loadS3 to load it up.

Preparation

Azure

Google cloud storage

SNOWFLAKE

A role, user, database, and warehouse are required for tests. You can create them:

SYNAPSE

Create a Synapse server in the Azure portal or using the CLI.

Set up the env variables: SYNAPSE_UID, SYNAPSE_PWD, SYNAPSE_DATABASE, SYNAPSE_SERVER.

Run query:

This will create a master key for PolyBase.

Managed Identity

Managed Identity is required when using ABS in a VNet. How to set up and use Managed Identity is described in the docs.

TL;DR: In the IAM settings of the ABS account, add the role assignment "Storage Blob Data {Reader or Contributor}" to your Synapse server principal.

Exasol

You can run Exasol locally in Docker or you can use SaaS.

Exasol locally in Docker

Run Exasol on your local machine in Docker (for this case, .env is preconfigured).

Or run an Exasol server somewhere else and set up the env variables:

Exasol in SaaS

Log in to the SaaS UI (or use a local client) and create a user with the following grants.

Obtain the host (with port), username, and password (from the previous step) and fill them in .env as described above. Make sure that your account has network access enabled for your IP.

Teradata

Prepare Teradata servers on AWS/Azure and set the following properties. See

Create a new database for tests:

Bigquery

Install the Google Cloud client (via Brew), initialize it, and log in to generate default credentials.

To prepare the backend, you can use the Terraform template. You must have the resourcemanager.folders.create permission for the organization.

Run terraform apply with the following variables:

For missing pieces, see the Connection repository. After terraform apply finishes, go to the service project in the folder created by Terraform.

  1. Convert the key to a single-line string and append it to the .env file: awk -v RS= '{$1=$1}1' <key_file>.json >> .env
  2. Set the content on the last line of .env as the variable BQ_KEY_FILE.
  3. Set the env variable BQ_BUCKET_NAME to the file_storage_bucket_id generated by the TF template.

Tests

Run tests with the following command.

Note: Azure credentials must be provided and fixtures uploaded.

Unit and functional tests can be run separately.

Code quality check

Full CI workflow

This command will run all checks, load fixtures, and run tests.

Usage

Snowflake

ABS -> Snowflake import/load
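The code sample for this operation is not reproduced on this page. As a hedged sketch, the call typically looks like the following; the class names come from the package's Snowflake backend, but the constructor arguments are omitted here and should be taken from the package source.

    <?php

    use Keboola\Db\ImportExport\Backend\Snowflake\Importer;
    use Keboola\Db\ImportExport\ImportOptions;
    use Keboola\Db\ImportExport\Storage;

    // $snowflakeConnection is an already established Snowflake connection.
    // Constructor arguments are intentionally omitted (container, blob path,
    // SAS token, schema, table name, ...); see the package source for details.
    $source = new Storage\ABS\SourceFile(/* ... */);
    $destination = new Storage\Snowflake\Table(/* ... */);
    $options = new ImportOptions(/* ... */);

    (new Importer($snowflakeConnection))->importTable($source, $destination, $options);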

Snowflake -> Snowflake copy
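Again as a sketch under the same assumptions, a table-to-table copy uses the same Importer, just with a Snowflake table as the source:

    <?php

    use Keboola\Db\ImportExport\Backend\Snowflake\Importer;
    use Keboola\Db\ImportExport\ImportOptions;
    use Keboola\Db\ImportExport\Storage;

    // Copy between two Snowflake tables; constructor arguments omitted as above.
    $source = new Storage\Snowflake\Table(/* schema, table name */);
    $destination = new Storage\Snowflake\Table(/* schema, table name */);
    $options = new ImportOptions(/* ... */);

    (new Importer($snowflakeConnection))->importTable($source, $destination, $options);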

Snowflake -> ABS export/unload
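Export mirrors import; the sketch below assumes the Exporter and ExportOptions counterparts in the Snowflake backend (again, check the package source for the exact arguments):

    <?php

    use Keboola\Db\ImportExport\Backend\Snowflake\Exporter;
    use Keboola\Db\ImportExport\ExportOptions;
    use Keboola\Db\ImportExport\Storage;

    // Unload a Snowflake table into Azure Blob Storage; arguments omitted.
    $source = new Storage\Snowflake\Table(/* schema, table name */);
    $destination = new Storage\ABS\DestinationFile(/* container, blob path, SAS token, ... */);
    $options = new ExportOptions(/* ... */);

    (new Exporter($snowflakeConnection))->exportTable($source, $destination, $options);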

Synapse next (experimental)

Import to Synapse

Internals/Extending

The library consists of a few simple interfaces.

Create new backend

The Importer and Exporter interfaces must be implemented in a new backend.

For each backend there is a corresponding adapter which supports its own combination of SourceInterface and DestinationInterface. Custom adapters can be set with the setAdapters method (see the sketch below).
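A non-authoritative sketch of that shape, with namespaces and signatures assumed from the prose above rather than copied from the package:

    <?php

    use Keboola\Db\ImportExport\ImportOptions;
    use Keboola\Db\ImportExport\Storage\SourceInterface;
    use Keboola\Db\ImportExport\Storage\DestinationInterface;

    // Hypothetical importer for a new backend. It delegates the actual load
    // to the first registered adapter that supports the source/destination pair.
    class MyBackendImporter
    {
        /** @var string[] adapter class names, overridable via setAdapters() */
        private array $adapters = [];

        public function importTable(
            SourceInterface $source,
            DestinationInterface $destination,
            ImportOptions $options
        ): void {
            // 1. find an adapter whose isSupported($source, $destination) returns true
            // 2. let that adapter run the backend-specific COPY/LOAD statements
        }

        /** @param string[] $adapters */
        public function setAdapters(array $adapters): void
        {
            $this->adapters = $adapters;
        }
    }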

Create new storage

Storage is currently either file storage (ABS|S3) or table storage (Snowflake|Synapse); more may be added in the future. A storage can have a Source and a Destination, which must implement SourceInterface or DestinationInterface. These interfaces are empty, and it's up to the adapter to support its own combination. In general, there is one Import/Export adapter per FileStorage <=> TableStorage combination.
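Since SourceInterface and DestinationInterface are described above as empty marker interfaces, a new storage mostly needs small value classes that implement them and carry whatever its adapter will consume. A minimal, hypothetical sketch (class name and properties are illustrative):

    <?php

    use Keboola\Db\ImportExport\Storage\SourceInterface;

    // Hypothetical source for a new file storage: it only implements the empty
    // marker interface and exposes the data a matching adapter would need.
    class MyFileStorageSource implements SourceInterface
    {
        public function __construct(
            private readonly string $bucket,
            private readonly string $filePath,
        ) {
        }

        public function getBucket(): string
        {
            return $this->bucket;
        }

        public function getFilePath(): string
        {
            return $this->filePath;
        }
    }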

An adapter must implement:
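The exact interface is not reproduced on this page; as an assumption-labeled sketch, an import adapter usually has to declare which source/destination pair it supports and perform the actual copy:

    <?php

    use Keboola\Db\ImportExport\ImportOptions;
    use Keboola\Db\ImportExport\Storage\SourceInterface;
    use Keboola\Db\ImportExport\Storage\DestinationInterface;

    // Hypothetical adapter shape; the real interface names and signatures
    // are defined in the package (and may be extended per backend).
    interface MyImportAdapterInterface
    {
        /** Does this adapter handle the given source/destination combination? */
        public static function isSupported(
            SourceInterface $source,
            DestinationInterface $destination
        ): bool;

        /** Execute the load and return the number of imported rows. */
        public function runCopyCommand(
            SourceInterface $source,
            DestinationInterface $destination,
            ImportOptions $importOptions,
            string $stagingTableName
        ): int;
    }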

A backend can require its own extended AdapterInterface (Synapse and Snowflake currently do).

License

MIT licensed, see LICENSE file.


All versions of db-import-export with dependencies

Requires:
  • php: ^8.1
  • ext-json: *
  • ext-pdo: *
  • doctrine/dbal: ^3.3
  • google/cloud-bigquery: ^1.23
  • google/cloud-storage: ^1.27
  • keboola/csv-options: ^1
  • keboola/php-csv-db-import: ^6
  • keboola/php-datatypes: ^7.6
  • keboola/php-file-storage-utils: ^0.2.2
  • keboola/php-temp: ^2.0
  • keboola/table-backend-utils: >=2.7
  • microsoft/azure-storage-blob: ^1.4
  • symfony/process: ^4.4|^5.0|^6.0
