•Operating Systems
Ubuntu 16.04.3 LTS 64bit, CentOS 7.4 64bit, Windows® 10 64bit
•OpenVINO™ toolkit
◦Intel® Deep Learning Deployment Toolkit
- Model Optimizer
- Inference Engine
◦Optimized computer vision libraries
◦Intel® Media SDK
◦Currently Supported Topologies: AlexNet, GoogLeNet V1/V2, MobileNet SSD, MobileNet V1/V2, MTCNN, SqueezeNet 1.0/1.1, Tiny YOLO V1 & V2, YOLO V2, ResNet-18/50/101
* For more information on supported topologies, please refer to the official Intel® OpenVINO™ toolkit website.
◦High flexibility: the Mustang-V100-MX4 is developed on the OpenVINO™ toolkit structure, which allows trained models from frameworks such as Caffe, TensorFlow, MXNet, and ONNX to execute on the card after conversion to the optimized Intermediate Representation (IR).
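A minimal sketch of that conversion step, assuming a default Linux OpenVINO™ installation (the Model Optimizer path and the model file names below are illustrative, not taken from this datasheet):

```shell
# Convert a trained Caffe model to optimized IR with the Model Optimizer
# (mo.py ships with the Intel® Deep Learning Deployment Toolkit; the path
# below assumes a default /opt/intel install).
python3 /opt/intel/openvino/deployment_tools/model_optimizer/mo.py \
    --input_model squeezenet1.1.caffemodel \
    --input_proto squeezenet1.1.prototxt \
    --data_type FP16   # Myriad™ X VPUs perform inference in FP16

# The resulting .xml/.bin IR pair is then loaded by the Inference Engine
# and dispatched to the card's VPUs through the MYRIAD device plugin.
```

The same `mo.py` entry point accepts TensorFlow, MXNet, and ONNX models; only the input arguments differ per framework.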
Model Name | Mustang-V100-MX4 |
---|---|
Main Chip | 4 x Intel® Movidius™ Myriad™ X MA2485 VPU |
Operating Systems | Ubuntu 16.04.3 LTS 64bit, CentOS 7.4 64bit, Windows® 10 64bit |
Dataplane Interface | PCIe Gen 2 x 2 |
Power Consumption | Approximately 15 W |
Operating Temperature | -20°C ~ 65°C (in TANK AIoT Dev. Kit) |
Cooling | Active fan |
Dimensions | 113 x 56 x 23 mm |
Operating Humidity | 5% ~ 90% |
DIP Switch/LED Indicator | Identifies card number |
Supported Topologies | AlexNet, GoogLeNet V1/V2, MobileNet SSD, MobileNet V1/V2, MTCNN, SqueezeNet 1.0/1.1, Tiny YOLO V1 & V2, YOLO V2, ResNet-18/50/101 |
Part No. | Description |
---|---|
Mustang-V100-MX4-R10 | Computing Accelerator Card with 4x Intel® Movidius™ Myriad™ X MA2485 VPU, PCIe Gen 2 x 2 interface, RoHS |