Bring Deep Learning to small devices

An open source deep learning platform for low-bit computation

Key features

  1. Small DNN

    Compress deep learning models while maintaining accuracy. Additionally, weights and activations are quantized to just 1 or 2 bits.

  2. Low end devices

    Specially designed circuits run deep learning on FPGA devices, which are faster than CPUs and use much less power than GPUs.

  3. Batteries included

    Everything from model design and quantization to synthesized circuits for hardware implementation, including FPGA-friendly network architectures, is ready to use.
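To make the "1 or 2 bit" idea above concrete, here is a minimal NumPy sketch of low-bit quantization: 1-bit weights via the sign function and 2-bit (4-level) activations via uniform rounding. The function names, the clipping range, and the level placement are illustrative assumptions, not Blueoil's actual quantizers.

```python
import numpy as np

def binarize_weights(w):
    """1-bit weights: map each value to {-1, +1} by sign (illustrative)."""
    return np.where(w >= 0, 1.0, -1.0)

def quantize_activations_2bit(x, max_val=2.0):
    """2-bit activations: clip to [0, max_val], then round to 4 uniform
    levels {0, 1/3, 2/3, 1} * max_val (illustrative; range is an assumption)."""
    levels = 3  # 2 bits -> 4 representable values -> 3 steps
    x = np.clip(x, 0.0, max_val) / max_val
    return np.round(x * levels) / levels * max_val

w = np.array([0.7, -0.2, 0.0, -1.3])
x = np.array([0.1, 1.9, 0.5, 3.0])
print(binarize_weights(w))           # -> [ 1. -1.  1. -1.]
print(quantize_activations_2bit(x))
```

During training, frameworks typically keep full-precision shadow weights and apply a quantizer like this only in the forward pass, passing gradients straight through.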

What is Blueoil?

Blueoil is a software stack dedicated to neural networks. It includes a special training method for quantization and original network architectures designed to be highly compatible with FPGA devices, so that a neural network can run at high speed on a low-powered FPGA. New models can be trained easily just by preparing data, and a finished model can be converted into a binary file that runs on FPGA or CPU devices with a single command.
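A key reason binarized networks run fast on FPGAs is that a dot product over {-1, +1} vectors reduces to XNOR plus popcount, which maps directly onto FPGA logic. The sketch below demonstrates that equivalence in NumPy; it is a conceptual illustration, not Blueoil's generated circuit.

```python
import numpy as np

def binary_dot(a_bits, b_bits):
    """Dot product of {-1, +1} vectors computed the bitwise way:
    XNOR counts agreeing positions, then dot = matches - mismatches."""
    n = len(a_bits)
    a = a_bits > 0            # encode +1 as True, -1 as False
    b = b_bits > 0
    matches = int(np.sum(~(a ^ b)))   # XNOR: positions where signs agree
    return 2 * matches - n            # matches - (n - matches)

a = np.array([1, -1, 1, 1])
b = np.array([1, 1, -1, 1])
print(binary_dot(a, b), int(np.dot(a, b)))  # both 0
```

In hardware the XNOR and popcount operate on packed bit words, so one clock cycle can process dozens of multiply-accumulates that would each cost a full floating-point unit otherwise.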