
AI Solution

  • PPC-F17B-BT  [ KC Certified ]

    Overview

    » Support IEI iRIS-2400 remote intelligent solution

    » Intel® Celeron® J1900 quad-core SoC

    » Robust IP65 aluminum front bezel

    » Aesthetic ultra-thin bezel for seamless panel mount installation

    » Projected capacitive multi-touch and resistive single touch options

    » Dual full-size PCIe Mini card expansion

    » 9 V~36 V wide-range DC input

    View product details
  • PPC-F10B-BT  [ KC Certified ]

    Overview

    » 5.7”, 8” and 10.4” fanless industrial panel PC

    » 2.0 GHz Intel® Celeron® J1900 quad-core processor or 1.58 GHz Intel® Celeron® N2807 dual-core processor

    » Low power consumption DDR3L memory supported

    » IP65-compliant front panel

    » 9 V~30 V wide DC input

    » mSATA SSD supported

    » Dual GbE for backup

    » -10°C~50°C extended operating temperature
    View product details
  • Mustang-M2BM-MX2  [ SAMPLE ]

    Overview

    •Operating Systems
    Ubuntu 16.04.3 LTS 64bit, CentOS 7.4 64bit, Windows® 10 64bit
    •OpenVINO™ toolkit
    ◦Intel® Deep Learning Deployment Toolkit
    - Model Optimizer
    - Inference Engine
    ◦Optimized computer vision libraries
    ◦Intel® Media SDK
    •Current Supported Topologies: AlexNet, GoogleNetV1/V2, MobileNet SSD, MobileNetV1/V2, MTCNN, Squeezenet1.0/1.1, Tiny Yolo V1 & V2, Yolo V2, ResNet-18/50/101
    * For more information on supported topologies, please refer to the official Intel® OpenVINO™ Toolkit website.
    •High flexibility: the Mustang-M2BM-MX2 is developed on the OpenVINO™ toolkit structure, which allows trained models from frameworks such as Caffe, TensorFlow, MXNet, and ONNX to run on the card after conversion to an optimized IR (see the sketch below).
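    As an illustration of that flow, the following is a minimal, hedged sketch using the OpenVINO™ Inference Engine Python API (2021.x-era openvino.inference_engine module). The model file names are placeholders, and the MYRIAD device name assumes the card's Myriad™ X VPUs are exposed through the standard Movidius plugin; multi-VPU cards may instead use the HDDL plugin, so consult the card's documentation for the exact setup.

    # Minimal sketch of the convert-to-IR-then-execute flow (2021.x Python API).
    # The IR (.xml/.bin) is produced offline by the Model Optimizer, e.g.:
    #   mo.py --input_model model.caffemodel --data_type FP16
    # File names and the device name below are assumptions, not shipped defaults.
    import numpy as np
    from openvino.inference_engine import IECore

    ie = IECore()
    net = ie.read_network(model="model.xml", weights="model.bin")   # load the IR

    # Compile the network for the card's Myriad X VPU (device name assumed).
    exec_net = ie.load_network(network=net, device_name="MYRIAD")

    # Build a dummy input matching the network's expected NCHW shape.
    input_name = next(iter(net.input_info))
    n, c, h, w = net.input_info[input_name].input_data.shape
    dummy = np.zeros((n, c, h, w), dtype=np.float32)

    # Run synchronous inference and list the output layer names.
    results = exec_net.infer(inputs={input_name: dummy})
    print(list(results.keys()))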
     

    View product details
  • Mustang-M2AE-MX1  [ SAMPLE ]

    Overview

    •Operating Systems
    Ubuntu 16.04.3 LTS 64bit, CentOS 7.4 64bit, Windows® 10 64bit
    •OpenVINO™ toolkit
    ◦Intel® Deep Learning Deployment Toolkit
    - Model Optimizer
    - Inference Engine
    ◦Optimized computer vision libraries
    ◦Intel® Media SDK

    •Current Supported Topologies: AlexNet, GoogleNetV1/V2, MobileNet SSD, MobileNetV1/V2, MTCNN, Squeezenet1.0/1.1, Tiny Yolo V1 & V2, Yolo V2, ResNet-18/50/101
    * For more information on supported topologies, please refer to the official Intel® OpenVINO™ Toolkit website.
    •High flexibility: the Mustang-M2AE-MX1 is developed on the OpenVINO™ toolkit structure, which allows trained models from frameworks such as Caffe, TensorFlow, MXNet, and ONNX to run on the card after conversion to an optimized IR (see the sketch below).
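    The same toolkit also supports asynchronous inference requests, which helps keep the single Myriad™ X on this card busy while the host prepares the next frame. A hedged sketch under the same assumptions as above (2021.x openvino.inference_engine API, placeholder file names):

    # Asynchronous-inference sketch for the same IR-based flow
    # (2021.x Python API; file and device names are assumptions).
    import numpy as np
    from openvino.inference_engine import IECore

    ie = IECore()
    net = ie.read_network(model="model.xml", weights="model.bin")
    exec_net = ie.load_network(network=net, device_name="MYRIAD", num_requests=2)

    input_name = next(iter(net.input_info))
    output_name = next(iter(net.outputs))
    shape = net.input_info[input_name].input_data.shape
    frame = np.zeros(shape, dtype=np.float32)       # placeholder frame

    # Start request 0 without blocking, then wait for and read its result.
    exec_net.start_async(request_id=0, inputs={input_name: frame})
    if exec_net.requests[0].wait(-1) == 0:          # 0 == StatusCode.OK
        result = exec_net.requests[0].output_blobs[output_name].buffer
        print(result.shape)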

    View product details
  • Mustang-MPCIE-MX2  [ KC Certified ]

    Overview

    ● miniPCIe form factor (30 x 50 mm)
    ● 2 x Intel® Movidius™ Myriad™ X VPU MA2485
    ● Power efficient, approximately 7.5 W
    ● Operating Temperature -20°C~60°C
    ● Powered by Intel’s OpenVINO™ toolkit
    View product details
  • Mustang-V100-MX4  [ KC Certified ]

    Overview

    •Operating Systems
    Ubuntu 16.04.3 LTS 64bit, CentOS 7.4 64bit, Windows® 10 64bit
    •OpenVINO™ toolkit
    ◦Intel® Deep Learning Deployment Toolkit
    - Model Optimizer
    - Inference Engine
    ◦Optimized computer vision libraries
    ◦Intel® Media SDK
    •Current Supported Topologies: AlexNet, GoogleNetV1/V2, MobileNet SSD, MobileNetV1/V2, MTCNN, Squeezenet1.0/1.1, Tiny Yolo V1 & V2, Yolo V2, ResNet-18/50/101
    * For more information on supported topologies, please refer to the official Intel® OpenVINO™ Toolkit website.
    •High flexibility: the Mustang-V100-MX4 is developed on the OpenVINO™ toolkit structure, which allows trained models from frameworks such as Caffe, TensorFlow, MXNet, and ONNX to run on the card after conversion to an optimized IR (see the sketch below).
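    Because this card carries several Myriad™ X VPUs, the chips are typically exposed through OpenVINO's HDDL plugin rather than the single-device MYRIAD plugin. The sketch below assumes that setup (2021.x openvino.inference_engine API, placeholder file names) and spreads work across several inference requests; check the card's driver documentation for the plugin it actually uses.

    # Hedged sketch: running an IR on a multi-VPU card through the HDDL plugin
    # (2021.x Python API; device name and file names are assumptions).
    import numpy as np
    from openvino.inference_engine import IECore

    ie = IECore()
    net = ie.read_network(model="model.xml", weights="model.bin")

    # Several parallel requests let the scheduler spread work over the VPUs.
    exec_net = ie.load_network(network=net, device_name="HDDL", num_requests=4)

    input_name = next(iter(net.input_info))
    shape = net.input_info[input_name].input_data.shape
    frames = [np.zeros(shape, dtype=np.float32) for _ in range(4)]

    for i, frame in enumerate(frames):
        exec_net.start_async(request_id=i, inputs={input_name: frame})
    for request in exec_net.requests:
        request.wait(-1)
    print("all requests finished")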

     

    View product details
  • Mustang-V100-MX8  [ KC Certified ]  [ Discontinued ]

    Discontinued

    Overview

    •Operating Systems
    Ubuntu 16.04.3 LTS 64bit, CentOS 7.4 64bit, Windows 10 64bit.
    •OpenVINO™ Toolkit
    ◦Intel® Deep Learning Deployment Toolkit
    - Model Optimizer
    - Inference Engine
    ◦Optimized computer vision libraries
    ◦Intel® Media SDK
    ◦*OpenCL™ graphics drivers and runtimes.
    ◦Current Supported Topologies: AlexNet, GoogleNetV1/V2, MobileNet SSD, MobileNetV1/V2, MTCNN, Squeezenet1.0/1.1, Tiny Yolo V1 & V2, Yolo V2, ResNet-18/50/101
    - For more information on supported topologies, please refer to the official Intel® OpenVINO™ Toolkit website.

    •High flexibility: the Mustang-V100-MX8 is developed on the OpenVINO™ toolkit structure, which allows trained models from frameworks such as Caffe, TensorFlow, and MXNet to run on the card after conversion to an optimized IR.
    *OpenCL™ is a trademark of Apple Inc., used by permission by Khronos.

    View product details
  • Mustang-F100-A10  [ SAMPLE ]

    Overview

    •Operating Systems
    Ubuntu 16.04.3 LTS 64bit, CentOS 7.4 64bit (Windows 10 support planned for the end of 2018; more operating systems coming soon)
    •OpenVINO™ toolkit

    ◦Intel® Deep Learning Deployment Toolkit

    - Model Optimizer
    - Inference Engine

    ◦Optimized computer vision libraries
    ◦Intel® Media SDK
    ◦*OpenCL™ graphics drivers and runtimes.
    ◦Current Supported Topologies: AlexNet, GoogleNet, Tiny Yolo, LeNet, SqueezeNet, VGG16, ResNet (more variants are coming soon)
    ◦Intel® FPGA Deep Learning Acceleration Suite

    •High flexibility: the Mustang-F100-A10 is developed on the OpenVINO™ toolkit structure, which allows trained models from frameworks such as Caffe, TensorFlow, and MXNet to run on the card after conversion to an optimized IR (see the sketch below).
    *OpenCL™ is a trademark of Apple Inc., used by permission by Khronos.
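    In toolkit releases that shipped the FPGA plugin, layers the FPGA cannot execute were commonly handled by a CPU fallback through the HETERO plugin. The sketch below is a hedged illustration of that pattern, assuming a release that provides both the IECore Python API and the FPGA plugin; file names are placeholders, and the proper bitstream must already be programmed onto the card.

    # Hedged sketch: FPGA execution with CPU fallback via the HETERO plugin.
    # Assumes a toolkit release that includes both the IECore Python API and
    # the FPGA plugin; file names are placeholders.
    import numpy as np
    from openvino.inference_engine import IECore

    ie = IECore()
    net = ie.read_network(model="model.xml", weights="model.bin")

    # Layers supported by the FPGA run there; everything else falls back to CPU.
    exec_net = ie.load_network(network=net, device_name="HETERO:FPGA,CPU")

    input_name = next(iter(net.input_info))
    shape = net.input_info[input_name].input_data.shape
    results = exec_net.infer(inputs={input_name: np.zeros(shape, dtype=np.float32)})
    print(list(results.keys()))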

    View product details
  • TANK AIoT Developer Kit  [ SAMPLE ]

    Overview

    The Intel® Distribution of OpenVINO™ toolkit is built around convolutional neural networks (CNNs); it extends workloads across multiple types of Intel® platforms and maximizes performance.

    It can optimize pre-trained deep learning models from frameworks such as Caffe, MXNet, and TensorFlow. The tool suite includes more than 20 pre-trained models and supports 100+ public and custom models (including Caffe*, MXNet, TensorFlow*, ONNX*, and Kaldi*) for easier deployment across Intel® silicon products (CPU, GPU/Intel® Processor Graphics, FPGA, VPU). A minimal device-enumeration sketch follows.
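    Because the same IR can be compiled for any supported plugin, a common pattern on a developer kit like this is to enumerate the devices OpenVINO can see and load the model onto each in turn. A minimal sketch (2021.x openvino.inference_engine API; the model files are placeholders, and which devices appear depends on the installed drivers and accelerators):

    # Hedged sketch: enumerating OpenVINO devices on the developer kit and
    # loading the same IR onto each one (2021.x Python API; placeholder files).
    from openvino.inference_engine import IECore

    ie = IECore()
    print("Devices visible to OpenVINO:", ie.available_devices)
    # e.g. ['CPU', 'GPU', 'MYRIAD'], depending on installed drivers/accelerators

    net = ie.read_network(model="model.xml", weights="model.bin")
    for device in ie.available_devices:
        exec_net = ie.load_network(network=net, device_name=device)
        print("Loaded network on", device)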

    View product details
  • GRAND-C422-20D  [ SAMPLE ]

    Overview

    ●Supports Intel® Xeon® W family processors
    ●6 x PCIe slots, up to 4 dual-width GPU cards
    ●CPU water-cooling system
    ●Supports two U.2 SSDs
    ●Supports one M.2 M-key SSD slot (NVMe PCIe 3.0 x4)
    ●Supports 10GbE networking
    ●IPMI remote management
    View product details