Documentation

This is the development branch of DiffSharp 1.0.

NOTE: This branch is undergoing development. It has incomplete code, functionality, and design that are likely to change without notice.

Getting Started

DiffSharp is normally used from an F# Jupyter notebook. You can simply open the examples directly in the browser.

To use locally in Visual Studio Code:

  • Install .NET Interactive Notebooks for VS Code

  • After opening an .ipynb file, press Ctrl-Shift-P to open the command palette, choose Reopen Editor With..., then select .NET Interactive Notebooks

  • To restart the kernel, use Restart from the command palette.

To use locally in Jupyter, first install Jupyter and then:

dotnet tool install -g --add-source "https://dotnet.myget.org/F/dotnet-try/api/v3/index.json" microsoft.dotnet-interactive
dotnet interactive jupyter install

When using .NET Interactive, it is best to turn off the automatic HTML display of outputs completely:

Formatter.SetPreferredMimeTypeFor(typeof<obj>, "text/plain")
Formatter.Register(fun x writer -> fprintfn writer "%120A" x )

You can also use DiffSharp from a script or an application. Here are some example scripts with appropriate package references:
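
For instance, a minimal sketch of such a script (the file name, the choice of the DiffSharp-cpu package, and the need to pin a pre-release version are assumptions; see the packages listed below):

// example.fsx (hypothetical file name)
#r "nuget: DiffSharp-cpu" // you may need to pin a specific (pre-release) version

open DiffSharp

dsharp.config(backend=Backend.Torch)

// Differentiate f(x) = x*x + sin x at x = 1.5
let f (x: Tensor) = x * x + sin x
let df = dsharp.grad f
printfn "%A" (df (dsharp.tensor 1.5))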

Available packages and backends

Now reference an appropriate nuget package from https://nuget.org:
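
The exact package list may change, but the main options at the time of writing are roughly the following (package names should be double-checked on nuget.org):

// in an .fsx script or notebook cell, pick one of:
#r "nuget: DiffSharp-cpu"          // Torch backend with bundled CPU LibTorch binaries
#r "nuget: DiffSharp-cuda-linux"   // Torch backend with bundled CUDA LibTorch binaries (Linux)
#r "nuget: DiffSharp-cuda-windows" // Torch backend with bundled CUDA LibTorch binaries (Windows)
#r "nuget: DiffSharp-lite"         // no bundled LibTorch; use your own installation (see below)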

For all but DiffSharp-lite add the following to your code:

dsharp.config(backend=Backend.Torch)

Using a pre-installed or self-built LibTorch 1.8.0

The Torch CPU and CUDA packages above are large. If you already have libtorch 1.8.0 available on your machine, you can do the following (see the sketch after this list):

  1. reference DiffSharp-lite

  2. set LD_LIBRARY_PATH to include a directory containing the relevant torch.so, torch_cpu.so and torch_cuda.so, or execute NativeLibrary.Load on torch.so.

  3. use dsharp.config(backend=Backend.Torch)
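
A minimal sketch of that workflow, assuming a Linux machine with LibTorch unpacked under /opt/libtorch (the path and library file name are assumptions):

// own-libtorch.fsx (hypothetical): DiffSharp-lite with a pre-installed LibTorch 1.8.0
#r "nuget: DiffSharp-lite" // you may need to pin a specific (pre-release) version

open System.Runtime.InteropServices
open DiffSharp

// Either export LD_LIBRARY_PATH before starting dotnet, e.g.
//   export LD_LIBRARY_PATH=/opt/libtorch/lib:$LD_LIBRARY_PATH
// or load the native library explicitly:
NativeLibrary.Load("/opt/libtorch/lib/libtorch.so") |> ignore

dsharp.config(backend=Backend.Torch)
printfn "%A" (dsharp.tensor [ 1.0; 2.0; 3.0 ])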

Developing DiffSharp Libraries

To develop libraries built on DiffSharp, do the following:

  1. reference DiffSharp.Core and DiffSharp.Data in your library code.

  2. reference DiffSharp.Backends.Reference in your correctness testing code.

  3. reference DiffSharp.Backends.Torch and libtorch-cpu in your CPU testing code.

  4. reference DiffSharp.Backends.Torch and libtorch-cuda-linux or libtorch-cuda-windows in your (optional) GPU testing code.
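
As a rough sketch of what backend-specific testing can look like (the script form, package choices, and comparison here are illustrative, not the project's prescribed test setup):

// backend-check.fsx (hypothetical): run the same computation on two backends
#r "nuget: DiffSharp.Core"
#r "nuget: DiffSharp.Backends.Reference"
#r "nuget: DiffSharp.Backends.Torch" // the Torch backend also needs LibTorch, e.g. via libtorch-cpu

open DiffSharp

let runOn backend =
    dsharp.config(backend=backend)
    let x = dsharp.tensor [ 1.0; 2.0; 3.0 ]
    dsharp.sum (x * x)

printfn "Reference: %A" (runOn Backend.Reference)
printfn "Torch:     %A" (runOn Backend.Torch)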