Key features
- Small DNN
Compress deep learning models while maintaining accuracy: weights and activations are quantized to just 1 or 2 bits.
- Low end devices
Specially designed circuits for deep learning run on FPGA devices, which are faster than a CPU and consume much less power than a GPU.
- Batteries included
Everything from model design and quantization to synthesized circuits for hardware implementation, including an FPGA-friendly network architecture, is ready to use.
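The 1- and 2-bit quantization mentioned above can be illustrated with a minimal sketch. This is a generic example of sign-based binarization and uniform 2-bit quantization, not Blueoil's actual implementation; the function names are hypothetical:

```python
def quantize_1bit(x):
    """Binarize a value to {-1.0, +1.0} by its sign (1-bit quantization)."""
    return 1.0 if x >= 0 else -1.0

def quantize_2bit(x, scale=1.0):
    """Map a value to one of 4 uniform levels {0, 1/3, 2/3, 1} * scale (2-bit quantization)."""
    # Clip to [0, scale], then round to the nearest of the 4 levels.
    clipped = min(max(x, 0.0), scale)
    level = round(clipped / scale * 3)
    return level * scale / 3

print(quantize_1bit(-0.4))   # -1.0
print(quantize_2bit(0.55))   # ~0.667 (nearest of the 4 levels)
```

Representing weights and activations with so few levels is what lets the network fit into simple, fast FPGA logic.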
What is Blueoil?
Blueoil is a software stack dedicated to neural networks. It includes special training methods for quantization and original network architectures designed to be highly compatible with FPGA devices, so that neural networks can run at high speed on low-power FPGAs. New models can be trained easily just by preparing data, and the finished model can be converted into a binary file that runs on FPGA or CPU devices with a single command.
Community
- GitHub
The Blueoil project is an open source project. We welcome contributions, ideas, and opinions. Please feel free to join us on GitHub.
- Slack
Stay up to date with the latest Blueoil news.
- SNS
Follow us on social media.