An open source deep learning platform for low-bit computation
Compress deep learning models while maintaining accuracy: weights and activations are quantized to just 1 or 2 bits.
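As a rough illustration of what 1-bit weights and 2-bit activations mean, here is a minimal numpy sketch of two common quantizers: sign-based weight binarization with a per-tensor scale, and uniform 2-bit activation quantization. The function names and the clipping range are illustrative assumptions, not Blueoil's actual API.

```python
import numpy as np

def binarize_weights(w):
    """1-bit weight quantization (illustrative): keep only the sign of each
    weight, scaled by the mean absolute value so the tensor's overall
    magnitude is roughly preserved."""
    scale = np.mean(np.abs(w))
    return scale * np.sign(w)

def quantize_activations_2bit(x, max_value=2.0):
    """2-bit activation quantization (illustrative): clip to [0, max_value]
    and round to one of 2**2 = 4 evenly spaced levels."""
    levels = 2 ** 2 - 1  # 3 steps between 4 representable values
    x = np.clip(x, 0.0, max_value)
    return np.round(x / max_value * levels) / levels * max_value
```

Replacing 32-bit floats with values drawn from such tiny sets is what makes the arithmetic cheap enough to map onto small, low-power FPGA circuits.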
Specially designed circuits for deep learning on FPGA devices run faster than a CPU and consume far less power than a GPU.
Everything from model design and quantization to synthesized circuits for hardware implementation, including FPGA-friendly network architectures, is ready to use.
Blueoil is a software stack dedicated to neural networks. It includes special training methods for quantization and original network architectures designed to be highly compatible with FPGA devices, so that a neural network can run at high speed on a low-power FPGA. New models can be trained easily, simply by preparing data. The finished model can be converted with a single command into a binary file that runs on FPGA or CPU devices.
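To give a flavor of why quantized networks need a special training method, the sketch below trains a toy linear model whose forward pass uses binarized weights while gradient updates are applied to latent full-precision weights. This "straight-through" trick is the standard, widely used approach for training 1-bit networks; it is shown here as a generic numpy example under that assumption, not as Blueoil's actual training code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem whose true weights are exactly +/-1.
X = rng.normal(size=(256, 4))
true_w = np.array([1.0, -1.0, 1.0, -1.0])
y = X @ true_w

w = rng.normal(size=4) * 0.1  # latent full-precision weights
lr = 0.05
for _ in range(200):
    scale = np.mean(np.abs(w))
    wq = scale * np.sign(w)      # forward pass uses 1-bit weights
    err = X @ wq - y
    grad = X.T @ err / len(X)    # gradient w.r.t. the quantized weights
    w -= lr * grad               # straight-through: update the latent weights

print(np.sign(w))  # signs of the learned latent weights
```

After training, the binarized weights recover the signs of the true weights even though the forward pass never sees a full-precision value, which is the essence of quantization-aware training.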
The Blueoil project is an open source project. We welcome contributions, ideas, and opinions. Please feel free to join us on GitHub.
Stay up to date with the latest Blueoil news.
Follow us on social media.