Package: dataflow
Short description: A framework to pipe your data from one source to another.
License: MIT
DataFlow
Build a pipeline to pour your data through!
About
This is a low-level lib that helps you build a data migration process. It requires a little bootstrapping to float, but that's easily done if you only need a minimal process or are just looking to test things out.
The main idea is that you're building a pipeline where each section can mutate the data flowing through it. This makes the process extremely flexible, since each new step can receive (completely) transformed data. It can also be dangerous, since mistakes can be costly, which is why we crafted solid error handling and reporting mechanisms.
Examples
Build a simple CLI import script.
You have a CSV file you need to import into a database? Let's do that!
For the sake of example we'll be importing users into the table "users" with the following columns:
- id (int)
- name (string)
- email (string, unique)
- age (int)
Our CSV file looks like this:

| first | last | email | age |
|-------|------|-------|-----|
| Arthur | Dayne | [email protected] | 23 |
| Gerold | Hightower | [email protected] | 43 |
| Eddard | Stark | [email protected] | 36 |
| Jaime | Lannister | [email protected] | 34 |
| Aegon I | Targaryen | [email protected] | 64 |
Let's use the PipeLine builder to set up some known steps. We need a little bootstrapping to get the DBAL writer and an empty emitter.
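A sketch of what that bootstrapping might look like. The Doctrine DBAL connection setup is standard; the `PipeLineBuilder`/`EmptyEmitter` wiring below is an assumption based on the description, so check the example file for the real API.

```php
<?php
// Standard Doctrine DBAL connection; driver and credentials are placeholders.
use Doctrine\DBAL\DriverManager;

$connection = DriverManager::getConnection([
    'driver'   => 'pdo_mysql',
    'host'     => 'localhost',
    'dbname'   => 'app',
    'user'     => 'app',
    'password' => 'secret',
]);

// Hypothetical builder wiring: the DBAL writer targeting the "users" table,
// with an (initially empty) emitter for reporting.
$builder  = new PipeLineBuilder($connection, new EmptyEmitter());
$pipeline = $builder
    ->write('users')   // assumed DBAL writer step
    ->getPipeline();
```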
Now instantiate the Plumber and pour.
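Roughly like so. "Plumber" and "pour" come from the text above; the constructor arguments (a source plus the built pipeline) are assumptions.

```php
<?php
// Hypothetical: feed the CSV source through the pipeline built earlier.
$plumber = new Plumber(new CsvSource('users.csv'), $pipeline);
$plumber->pour();
```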
We want some reporting to know what's going on. Let's implement a basic stdOut emitter.
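A minimal sketch of such an emitter. The interface name and method signature are assumptions; adapt them to the package's actual emitter contract.

```php
<?php
// Writes every emitted event straight to stdout.
class StdOutEmitter implements EmitterInterface
{
    public function emit(string $type, string $message, ...$context): void
    {
        fwrite(STDOUT, sprintf('[%s] %s%s', strtoupper($type), $message, PHP_EOL));
    }
}
```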
There's one more thing we need to do: concatenate firstName and lastName and assign the result to "name".
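The transform itself is plain PHP; how it hooks into the pipeline (a `pipe()` callback section here) is an assumption about the builder API.

```php
<?php
// Build "name" from the CSV's first/last columns so the row matches
// the "users" schema, and drop the original columns.
$concatName = function (array $row): array {
    $row['name'] = trim($row['first'] . ' ' . $row['last']);
    unset($row['first'], $row['last']);
    return $row;
};

$builder->pipe($concatName); // hypothetical way to register a custom section
```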
The full file can be found under `examples/example1-import-cli/import.php`.

Logged results:
Goals
- Easy to maintain. Strong reporting shows exactly what went wrong.
- Flexible. An extremely flexible workflow supports different import/export/migration scenarios:
  - one file -> multiple tables
  - multiple files -> one table
  - dynamic DB connections
  - different source and destination types
- Fast. Low-level operations produce maximum flow speed.
- Stable. Strong test coverage guarantees a bug-free groundwork.