
About Laravel CSV Importer

A Laravel 4 library for importing/exporting CSV files into/out of database tables using Eloquent.

Note: This library is still early in development. You may find functionality that is inflexible or poorly documented. You are welcome to open issues, but my schedule may prevent me from responding to them in a timely fashion. Nevertheless, I've open sourced it, as none of the other Laravel CSV import/export packages could handle cross-CSV references.

Features

Requirements

  • PHP >= 5.4.0
  • illuminate/support 4.2.*
  • mnshankar/csv 1.8
  • marcj/topsort ~0.1

Installation

To install, run the following Composer command in your Laravel project root.
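A sketch of the install command (no version constraint is shown; pin one as needed):

```shell
composer require yavor-ivanov/csv-importer
```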

Alternatively, you can manually add the package to your composer.json under the require field.

Then, open app/config/app.php and add the following line in the providers array:
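A sketch of the provider registration; the fully qualified class name here is an assumption based on the package's vendor and class naming:

```php
// app/config/app.php -- the provider class name is an assumption
'providers' => array(
    // ...
    'YavorIvanov\CsvImporter\CSVImporterServiceProvider',
),
```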

The package is configured by default to look for CSV files in app/csv/files/ and for importer/exporter scripts in app/csv/importer and app/csv/exporter (see the Configuration section below).

You'll need to create all of these folders by running these commands:
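A sketch of the folder setup, assuming the default paths described in the Configuration section:

```shell
# Default paths assumed from the Configuration section
mkdir -p app/csv/files app/csv/importer app/csv/exporter
```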

Examples

I've set up an examples repository containing a Laravel 4 application with multiple importers and exporters.

Configuration

The importer package comes preconfigured to look for CSV files and importer/exporter scripts in the app/csv/files/ and app/csv/importer (and exporter) folders. If you wish to change this, you'll need to first get your own (application) copy of the configuration by running the following command:
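In Laravel 4, a package's configuration is copied into the application with artisan's config:publish command:

```shell
php artisan config:publish yavor-ivanov/csv-importer
```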

You will get a copy of the default configuration in app/config/packages/yavor-ivanov/csv-importer/config.php. Any changes to this file override the default configuration.

This is what the default config file looks like:
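A sketch of what the published file might contain; the exact keys are assumptions based on the default paths described above:

```php
<?php
// Hypothetical config keys; only the default paths are documented above.
return array(
    'csv_path'      => app_path() . '/csv/files/',
    'importer_path' => app_path() . '/csv/importer/',
    'exporter_path' => app_path() . '/csv/exporter/',
);
```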

The configuration options are as follows:

Naming conventions

The package establishes the following file and class naming convention for the importer/exporters:

In exchange for these limitations, the package is able to automatically register any importers/exporters that follow the convention. This allows you to:

Usage

Commands

The package comes with two commands: csv:import and csv:export.
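Example invocations (the importer name and the mode flag spelling are assumptions; run php artisan help csv:import for the authoritative format):

```shell
# Hypothetical invocations; importer name and --mode flag are assumed
php artisan csv:import user_roles
php artisan csv:import user_roles --mode=update
php artisan csv:export user_roles
```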

The importer supports the following modes: append, update, and overwrite.

The mode parameter is optional; the importer runs in append mode if one is not supplied.

NOTE: At the moment, the package doesn't support deleting records from the database when they are removed from a CSV file, as this is potentially error prone. A prune mode that exclusively performs this operation will be added in a later version. Right now, the only way to delete database records is with the overwrite mode.

For more information on the command format, run php artisan help csv:import

CSV format

The only requirements for the CSV format are:

  1. The CSV must include a header row
  2. All columns in the header must be named
  3. There must be at least one unique column
  4. Cells which contain spaces must be quoted in the CSV output

An example of a valid CSV:

```
id,role_name
1,"super admin"
2,admin
3,moderator
4,user
```

(Table view)

| id | role_name   |
|----|-------------|
| 1  | super admin |
| 2  | admin       |
| 3  | moderator   |
| 4  | user        |

Database table format

| id | role_name   | csv_id |
|----|-------------|--------|
| 4  | super admin | 1      |
| 5  | admin       | 2      |
| 6  | moderator   | 3      |
| 7  | user        | 4      |

In order to make use of caching, update mode, and cross-CSV references, the database table must include a csv_id column. This is used to find the database record for the corresponding CSV row.

Model

The package uses Laravel's Eloquent ORM to read and write to the database. This means that there is no configuration needed for database access. This also allows you to use Eloquent features (such as observers, validators, custom properties, etc.) while importing and exporting.

In order to make sure the csv_id column autoincrements when creating new records outside the importer, you must use the CSVReferenceTrait in your model like so:
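A sketch of pulling the trait into a model (the trait's namespace is an assumption):

```php
<?php
// The namespace is an assumption; CSVReferenceTrait is the name used by the package.
use YavorIvanov\CsvImporter\CSVReferenceTrait;

class UserRole extends Eloquent
{
    use CSVReferenceTrait;
}
```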

The CSVReferenceTrait registers a save hook, which sets the proper csv_id for models saved without one.

NOTE: You can import to properties not listed in the $fillable array, as the importer turns off the Eloquent field guarding while importing. (Don't worry, it re-guards them when it's done.)

Importers

Minimum configuration importer example

The following is the minimum configuration needed for an importer class that creates a collection of UserRole models, imports them into the database, and supports update mode:
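A sketch of such an importer, using the user-roles CSV from earlier; the base-class namespace, the property spellings, and the mapping direction are all assumptions:

```php
<?php
// Illustrative only; check the examples repository for the real property names.
use YavorIvanov\CsvImporter\CSVImporter;

class UserRolesImporter extends CSVImporter
{
    // Eloquent model the rows are imported into.
    public $model = 'UserRole';

    // CSV file to look for in the configured files folder.
    public $file = 'user_roles.csv';

    // Cache database records by csv_id, matched against the CSV's id column
    // ($cache_key format: table column => CSV column).
    protected $cache_key = array('csv_id' => 'id');

    // CSV column => table column (direction assumed).
    protected $column_mapping = array(
        'id'        => 'csv_id',
        'role_name' => 'name',
    );
}
```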

Alternatively, you can forgo the $column_mapping array and define the import and update functions yourself (you can even mix the two approaches):

Every importer class name must end with Importer (controlled by the class_match_pattern property), and the class must extend the CSVImporter base class.

Field breakdown:

Column mappings

Often, the database table columns differ in name (or representation) from the CSV files you wish to import. For example, databases accept dates in the ISO 8601 format (YYYY-MM-DD), but your CSV files may contain dates in the American date format (MM/DD/YYYY). The $column_mapping property allows you to define any name difference between the CSV and the database table, as well as transform and validate the CSV data before saving.

The $column_mapping format is flexible. It allows you to define a column with a name difference, multiple preprocessing steps, as well as validation functions:

The $column_mapping is also used by the importer when no import_row or update function is defined. It performs the import by running the validator/processor functions on the csv_column and saving the result to the specified table_column. The table_column could also be a model property or model function:

Note: In order to assign to model properties, the importer uses the table_column key as part of an eval() call. The call is limited to the current model instance, yet there are no checks for malicious intent, such as calling delete() or using id; call_malicious_function(); $variable_name as a key.

When the importer reads the CSV file, it looks at the $column_mapping to determine if it should transform the input data (or run validations against it).

The processor functions are read from the result of the get_processors function:

The function names of the preprocessors are determined by the array keys returned.

The validators are returned by the get_validators function.

For brevity, you can omit unused features from your column specification. For example, importing a column with only a name change can be condensed to the following:

Another example of this is using preprocessors without passing in parameters:

Or omitting the array if there is only one processor:
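The progressively condensed forms described above might look like this (the key names and shapes are assumptions inferred from the surrounding text):

```php
<?php
// Full form: name change, a parameterized processor, and a validator.
$full = array(
    'date' => array(
        'table_column' => 'published_at',
        'processors'   => array('american_date' => array()),
        'validators'   => array('not_empty'),
    ),
);

// Name change only.
$renamed = array('role_name' => 'name');

// A single processor, with the wrapping array omitted.
$trimmed = array('name' => array('processors' => 'trim'));
```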

Adding dependencies and referencing an importer

Oftentimes, the data in CSV files is relational in nature. In order to resolve the relationships properly, the importer needs to import files in the correct order. For instance, if users.csv references a phone number from phone_numbers.csv, the importer should import the phone numbers before it imports the users, as well as fetch any phone numbers that may be in the database (but not in the phone_numbers.csv file).

The package evaluates the import order by reading the dependencies of each importer, defined in the static $deps property. This collection of dependencies forms a dependency graph that can be topologically sorted to yield the order in which the package must call the importers.

Whenever you run an importer that has dependencies from the command line, you will see progress bars for the importer and its dependencies:

Here, the Book importer depends on the Author importer. At the moment, the progress bars are not labeled.

Aside from declaring dependencies, an importer will need to access the data from the entities of its dependencies. In the Books and Authors example, a Books importer may wish to find its Author's id and use it as a foreign key.

The get_from_context function returns an Eloquent model instance from a dependency name and a search value: protected function get_from_context($ctx, $key)

The column to select on is determined by the $primary_key property of the dependent importer. If no $primary_key is defined, the $cache_key mapping is used instead. This allows you to cache a model by one column (most frequently id), yet refer to it by another (unique) column from other importers. An example of this is caching a User model by id, but referring to it by email from an Order.

Caching

In the beginning of the import, the package selects all rows from the importer's $model table, and caches them by a unique CSV column in the $cache_key mapping:

protected $cache_key = ['table_column_name' => 'csv_column_name'];

This allows the package to avoid importing CSV rows that are already in the database. Each CSV record can be checked for inclusion in the database by comparing the values of the unique keys defined in the $cache_key mapping. For example, a mapping of id <--> csv_id would search the cache for an Eloquent entity with an id equal to the current CSV row's csv_id.

Aside from the performance benefit of caching, you can query the cache in importers by calling get_from_cache($hash). This is useful when importing self-referential CSVs. An example would be importing a tree structure in the following format: [id, name, parent_id], where each branch prepends its parent's name to its own.

Adding preprocessor functions

You can define processor functions for your importers by adding a get_processors() function, which returns an array of functions. The package will run these functions on the columns which list them in the $column_mapping.

You can define the processor functions inline:
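A plain-PHP sketch of inline processors; the processor names are hypothetical, and in a real importer this would be a method on the importer class:

```php
<?php
// Hypothetical processors, keyed by the names used in $column_mapping.
function get_processors()
{
    return array(
        // Strip surrounding whitespace from a cell.
        'trim' => function ($value) {
            return trim($value);
        },
        // Convert the American MM/DD/YYYY format to ISO 8601 YYYY-MM-DD.
        'american_date' => function ($value) {
            $date = DateTime::createFromFormat('m/d/Y', $value);
            return $date ? $date->format('Y-m-d') : null;
        },
    );
}
```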

Or, if you wish to share them across importers, you can define them in a shared file, and reference them:

The base importer also defines its own processors that can be used by all importers, as they are 'inherited'.

Note: As of the moment there is no way to register global preprocessors like the base importer does.

Adding validation functions

Validation functions are defined similarly to the preprocessors. The get_validators function of the importer returns an array of functions that can be defined to run when an importer runs via the $column_mapping property.

Below is an example validator which checks a column for uniqueness:
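A plain-PHP sketch of such a validator; since validators receive the column name and the whole row, this one tracks previously seen values in a closure. In a real importer this would be the importer's get_validators() method:

```php
<?php
// Hypothetical uniqueness validator.
function get_validators()
{
    $seen = array();
    return array(
        'unique' => function ($col, $row) use (&$seen) {
            $value = $row[$col];
            if (isset($seen[$col][$value])) {
                return false; // value already appeared in this column
            }
            $seen[$col][$value] = true;
            return true;
        },
    );
}
```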

The validator functions receive both the column name they have been called on and the whole CSV row. This allows you to create validation rules such as: property X is valid only if Y is NULL.

Column pivoting

Sometimes your pivot CSV table nicely mirrors the database pivot table:

| book_id | genre_id |
|---------|----------|
| 1       | 2        |
| 1       | 1        |

Other times, your many-to-many pivot CSV may come in a weird, multiple-column format:

| book_id | genre1 | genre2 | ... |
|---------|--------|--------|-----|
| 1       | 2      | 1      |     |

Although changing the CSV format to be in tune with the database table layout would be ideal, you may not always have that luxury (widely used legacy formats, for instance).

When this happens, you can do some preprocessing of the individual rows before the import/update function reads them.

In the example below, the pivot_row function takes a row in the [book_id, genre1, genre2, ... genreN] format and replaces it with multiple rows of [book_id, genre_id] tuples:
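A plain-PHP sketch of what such a pivot step might look like (the row shapes are assumptions; the real pivot_row runs inside the importer):

```php
<?php
// Turn one wide row [book_id, genre1, ..., genreN] into narrow
// [book_id, genre_id] tuples.
function pivot_row(array $row)
{
    $book_id = array_shift($row); // first cell is the book id
    $rows = array();
    foreach ($row as $genre_id) {
        if ($genre_id !== null && $genre_id !== '') {
            $rows[] = array('book_id' => $book_id, 'genre_id' => $genre_id);
        }
    }
    return $rows;
}
```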

The pivot step is run before column processors and validators.

Exporters

Minimum configuration exporter example

The following is a minimal exporter example using the $column_mapping to drive the CSV export:
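A sketch of such an exporter; the base-class namespace, the property spellings, and the mapping direction are assumptions:

```php
<?php
// Illustrative only; a mirror of the user-roles example used earlier.
use YavorIvanov\CsvImporter\CSVExporter;

class UserRolesExporter extends CSVExporter
{
    // Eloquent model the rows are exported from.
    public $model = 'UserRole';

    // CSV file to write to.
    public $file = 'user_roles.csv';

    // table column => CSV column (direction assumed).
    protected $column_mapping = array(
        'csv_id' => 'id',
        'name'   => 'role_name',
    );
}
```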

Note: Every exporter class name must end with Exporter (controlled by the class_match_pattern property) and extend the CSVExporter base class.

Field breakdown:

Column mappings

The exporter $column_mapping property serves the same purpose as the importer $column_mapping. It allows you to define a relationship between the table columns (or model properties/methods) and CSV columns. The exporter uses this mapping to generate the CSV output.

Like the importer, the exporter $column_mapping allows you to use model properties, as well as map model functions to CSV columns. Here's an example of a mapping between a model property and a CSV column that uses a postprocessing function:

And an example of an exporter mapping a model function to a CSV column:
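Sketches of the two mapping styles described above (all names and shapes here are hypothetical):

```php
<?php
// Model property -> CSV column, with a postprocessing function.
$property_mapping = array(
    'published_at' => array('date' => array('processors' => 'american_date')),
);

// Model function -> CSV column: the key is evaluated on the model instance,
// so it can be a method call or a relationship lookup.
$function_mapping = array(
    'full_name()'    => 'name',
    'author->csv_id' => 'author_id',
);
```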

Exporting is generally more straightforward than importing, so there's no need to use an export_row-like function (although the option is available). The exporter can read the $column_mapping and automatically output the CSV file.

In order to support certain mappings, the exporter evaluates the key of the $column_mapping entry (model_property in the above example) and uses the result as the value of csv_column when exporting. Some examples of such mappings are: computed properties, aggregate functions, and relationship properties.

The book exporter example uses such a mapping to get the csv id of its authors relationship:

Note: The exporter uses the $column_mapping key as part of an eval() call. The call is limited to the current model instance, yet there are no checks for malicious intent, such as calling delete() or using id; call_malicious_function() as a key.

Like the importer $column_mapping property, the exporter allows you to simplify the row declaration if you don't need to use a postprocessor, or the CSV and database columns coincide:

Post-processing

Like the importer, the package reads post-processor functions from the result of the get_processors function:

The function names of the post-processors are determined by the array keys returned from get_processors.

Generating rows programmatically

Sometimes, the column mappings just aren't flexible enough to handle your export logic. In such cases, you can use the generate_row function to generate the rows programmatically. Cases where you may want to do this include: exporting CSV files with a variable number of columns, exporting CSVs with data from multiple models, exporting data external to the model, etc.

Once the export process starts, the exporter selects all records from the $model entity and calls generate_row on each record. The function should return an array in the following format: ['csv_column1' => 'value', 'csv_column2' => 'other_value'].

The following is an example of exporting a CSV with a variable number of columns. You can see the full code in the examples:
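A plain-PHP sketch of a variable-column generate_row; the field names are assumptions, and the real version receives a model instance rather than an array:

```php
<?php
// Emit one genreN column per genre attached to the book.
function generate_row(array $book)
{
    $row = array('id' => $book['csv_id'], 'name' => $book['name']);
    $i = 1;
    foreach ($book['genres'] as $genre_id) {
        $row['genre' . $i++] = $genre_id;
    }
    return $row;
}
```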

Note: If you choose to override this function, the exporter will not process the $column_mapping property, unless you call parent::generate_row(). This allows you to mix custom row generation logic with column mappings if you want (or skip the automatic mapping entirely).

Licensed under the MIT license

The project license file can be found in the repository.

