Download the PHP package numpower/autograd without Composer

On this page you can find all versions of the PHP package numpower/autograd. You can download and install these versions without Composer; possible dependencies are resolved automatically.

FAQ

After the download, you only have to add one include, require_once('vendor/autoload.php');, and then import the classes with use statements.

Example:
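A minimal reconstruction of that example, using this package's NumPower\Tensor class as the imported class:

```php
<?php
// One include after the download (per the FAQ above) ...
require_once('vendor/autoload.php');

// ... then import the classes you need with use statements.
use NumPower\Tensor;
```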
If you use only one package, a project is not needed. But if you use more than one package, it is not possible to import the classes with use statements without a project.

In general, it is recommended to always use a project to download your libraries, since an application normally needs more than one library.
Some PHP packages are not free to download and are therefore hosted in private repositories. In this case, credentials are needed to access such packages. Please use the auth.json textarea to insert credentials if a package comes from a private repository. You can look here for more information.

  • Some hosting environments cannot be accessed via a terminal or SSH, which makes it impossible to use Composer there.
  • Using Composer can be complicated, especially for beginners.
  • Composer needs considerable resources, which are sometimes not available on a simple webspace.
  • If you use private repositories, you don't need to share your credentials: set everything up on our site and then provide a simple download link to your team members.
  • Simplify your Composer build process: use our command line tool to download the vendor folder as a binary. This makes your build process faster, and you don't need to expose your credentials for private repositories.

Information about the package autograd

NumPower Autograd

NumPower Autograd extends the NumPower extension with reverse-mode (backpropagation) automatic differentiation. Automatic differentiation is a computational technique for computing derivatives of functions efficiently and accurately. Unlike numerical differentiation, which is prone to rounding error and inefficiency, automatic differentiation systematically applies the chain rule to compute exact gradients.
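For example, for a composition c = f(g(a)), the chain rule factors the derivative as

$$\frac{dc}{da} = f'\big(g(a)\big)\, g'(a),$$

and reverse mode evaluates these factors starting from the output and moving backward toward the inputs.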

Reverse-mode automatic differentiation, commonly known as backpropagation, is particularly powerful in machine learning and optimization tasks: it computes the gradient of a scalar function with respect to all of its N-dimensional arguments in a single backward pass, making it ideal for training complex models.

NumPower Autograd also supports GPU acceleration, leveraging graphics processing units for faster computation. This is particularly valuable for large-scale mathematical and machine learning workloads.

By integrating backpropagation and GPU support, NumPower Autograd provides high-performance gradient computation capabilities, facilitating advanced mathematical and machine learning applications.


Requirements

  • PHP >= 8.3
  • NumPower extension (ext-numpower) >= 0.5.0
  • GD extension (ext-gd)

Installing

Via Composer, the standard command is composer require numpower/autograd; alternatively, use the Composer-free download offered on this page.

Getting Started

The NumPower\Tensor class is a type of N-dimensional array with support for automatic differentiation. It uses the NDArray object from the NumPower extension as its engine, retaining much of the flexibility of an NDArray, such as serving as an operand in arithmetic operations, which simplifies code readability.

The following code demonstrates a simple example of automatic differentiation using variables and operations: it computes the gradients of the result of a computation ($c) with respect to its inputs ($a and $b).
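A minimal sketch of that example (reconstructed; the requireGrad constructor flag and the backward()/grad() methods follow the library's README as best recalled and should be checked against the official documentation):

```php
<?php
require_once('vendor/autoload.php');

use NumPower\Tensor;

// Two 2x2 tensors with gradient tracking enabled.
$a = new Tensor([[1, 2], [3, 4]], requireGrad: true);
$b = new Tensor([[5, 6], [7, 8]], requireGrad: true);

// Overloaded arithmetic operators build the computation graph.
$c = (($a + $b) / $b)->sum();

// Reverse-mode pass: propagate gradients from $c back to the inputs.
$c->backward();

// Gradients of $c with respect to $a and $b.
$dc_da = $a->grad();
$dc_db = $b->grad();

echo "dc/da:\n" . $dc_da . "\n";
echo "dc/db:\n" . $dc_db . "\n";
```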

Using a video card with CUDA support

If you have compiled the NumPower extension with GPU utilization capabilities, you can perform operations on the GPU by allocating your Tensor in VRAM. The operations will automatically identify whether your Tensor is in RAM or VRAM and will perform the appropriate operation on the correct device.

Tensor works exactly like NDArray when handling different devices in multi-argument operations: unless an argument is a scalar, all N-dimensional arguments of an operation must be stored on the same device.
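For illustration, a sketch of GPU allocation (assuming the nd::ones() initializer and gpu() transfer method from the NumPower extension's documentation, plus the Tensor constructor shown above):

```php
<?php
require_once('vendor/autoload.php');

use NDArray as nd;
use NumPower\Tensor;

// Allocate the underlying NDArrays in VRAM. The gpu() transfer method
// is taken from the NumPower extension docs as best recalled; verify
// it against your installed version.
$x = nd::ones([1024, 1024])->gpu();
$y = nd::ones([1024, 1024])->gpu();

$a = new Tensor($x, requireGrad: true);
$b = new Tensor($y, requireGrad: true);

// Both operands live in VRAM, so the operation runs on the GPU;
// mixing a VRAM tensor with a RAM tensor would be an error, although
// scalar arguments are always allowed.
$c = ($a * $b)->sum();
$c->backward();
```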

Simple training with autograd

Here is a more practical example: let's create a neural network with one hidden layer and one output layer. Some common neural network functions are ready-made and can be accessed statically through the NumPower\NeuralNetwork module.

If you want to know more about each step of creating this model, access our documentation and see step-by-step instructions on how to build this model.

The above model can be used for several different classification problems. For simplicity, let's see if our model can solve the XOR problem.
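A compressed sketch of such a model and training loop (illustrative only: the nd::* initializers and the Tensor method names used below, such as matmul(), sigmoid(), and assign(), are assumptions based on the library's style, and the real NumPower\NeuralNetwork helpers may differ; see the documentation for the canonical version):

```php
<?php
require_once('vendor/autoload.php');

use NDArray as nd;
use NumPower\Tensor;

// XOR truth table: 4 samples, 2 features in, 1 target out.
$x = new Tensor([[0, 0], [0, 1], [1, 0], [1, 1]]);
$y = new Tensor([[0], [1], [1], [0]]);

// Trainable parameters: hidden layer (2 -> 8), output layer (8 -> 1).
// nd::uniform()/nd::zeros() and requireGrad are assumed names.
$w1 = new Tensor(nd::uniform([2, 8], -0.5, 0.5), requireGrad: true);
$b1 = new Tensor(nd::zeros([8]), requireGrad: true);
$w2 = new Tensor(nd::uniform([8, 1], -0.5, 0.5), requireGrad: true);
$b2 = new Tensor(nd::zeros([1]), requireGrad: true);

$lr = 0.1;
for ($epoch = 0; $epoch < 2000; $epoch++) {
    // Forward pass with sigmoid activations.
    $hidden = $x->matmul($w1)->add($b1)->sigmoid();
    $output = $hidden->matmul($w2)->add($b2)->sigmoid();

    // Mean squared error against the XOR targets.
    $loss = ($output - $y)->power(2)->mean();

    // Backpropagation, then a plain gradient-descent step.
    $loss->backward();
    foreach ([$w1, $b1, $w2, $b2] as $p) {
        // Hypothetical in-place update; real code must also clear the
        // accumulated gradient each iteration (API for both assumed).
        $p->assign($p - $p->grad() * $lr);
    }
}
```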


All versions of autograd with dependencies

Requires:
  • php >= 8.3
  • ext-numpower >= 0.5.0
  • ext-gd *
Composer command for our command line client (download client): this client runs in every environment; you don't need a specific PHP version, and the first 20 API calls are free. The standard Composer command is composer require numpower/autograd.
