[HW Accel Support]: ARC A310 Support? Working Setup Questions. #14799
-
Describe the problem you are having:
I don't have a problem, necessarily; I'm curious about my upgraded setup. I upgraded from an Intel 14500 / Q670 platform that used the iGPU plus an M.2 Coral card, which worked pretty well. I've had mostly success with A+E and M.2 Coral cards in a couple of builds. I recently decided to make some changes and consolidate to the newer Z890 Intel Core Ultra platform to see if it would work.

Overall the consolidation went well. The only issue I had was with the M.2 Coral device, which seems to have a problem, possibly with power management: I keep getting "device inaccessible" and some other errors, so perhaps a driver or BIOS issue. So I was stuck. I could have reverted to the Tensor setup I still had the config and hardware for, but I want to keep power draw down as much as possible, which is why I prefer Coral with the iGPU.

My setup is Proxmox >> VM >> Docker with PCIe passthrough. I grabbed a Sparkle A310 card to play around with for transcoding and decided to just move my containers to the VM that has this card passed through to it. The host passes the A310's IOMMU group to the VM, which is what makes /dev/dri with card0, card1, and renderD128 available in the VM (I think), and /dev/dri is then passed from the VM into the container. The iGPU is NOT passed to the VM. I then switched to the CPU detector type instead of Coral in my Frigate config.
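For reference, the CPU detector block in a Frigate config is just a type switch under `detectors` (a minimal sketch; the detector name `cpu1` below is arbitrary):

```yaml
# Minimal sketch of a CPU-based detector in Frigate's config.
# "cpu1" is an arbitrary detector name; "type: cpu" selects the CPU detector.
detectors:
  cpu1:
    type: cpu
```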
So if I'm understanding this correctly, this should now be using the A310 for detection. Either way, the setup is working extremely well; it's just as fluid and responsive as it was with the Coral card. I was going to maybe get the USB Coral, but I'm not a fan of USB, and at about $80 I really didn't want another card even though it probably would have worked. What I'm trying to understand is: how can I confirm that the GPU is handling detection? I'm running 8 detect substreams. I just don't see a reason to use a Coral card now at all?
Detect inference is about 24 ms.

Process Output:
vainfo: VA-API version: 1.22 (libva 2.10.0)

Version: 15-beta
Frigate config file: NA
docker-compose file or Docker CLI command: NA
Relevant Frigate log output: NA
Relevant go2rtc log output: NA
FFprobe output from your camera: NA
Install method: Docker Compose
Object Detector: Other
Network connection: Wired
Camera make and model: NA
Screenshots of the Frigate UI's System metrics pages: NA
Any other information that may be helpful: No response
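For anyone curious about the container side of the passthrough described above: exposing the VM's /dev/dri to the Frigate container in docker-compose is typically just a devices mapping, roughly like this (a minimal sketch; the service name and image tag are illustrative, not copied from this setup):

```yaml
# Minimal sketch of exposing the VM's /dev/dri (Arc A310 render nodes) to the container.
# Service name and image tag are illustrative only.
services:
  frigate:
    image: ghcr.io/blakeblackshear/frigate:stable
    devices:
      - /dev/dri:/dev/dri
```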
-
Using the `cpu` detector type means detection runs on the CPU, not on the A310; with that config the GPU is at most doing video decoding (hwaccel), not inference. To run detection on the Arc GPU, switch to the `openvino` detector.
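Something along these lines should work for an Arc card (a minimal sketch based on the OpenVINO example in the Frigate docs; the detector name is arbitrary, and the model paths/settings may differ depending on the Frigate version):

```yaml
# Minimal sketch of an OpenVINO detector running on an Intel GPU such as the Arc A310.
# "ov" is an arbitrary detector name; "device: GPU" targets the GPU instead of the CPU.
detectors:
  ov:
    type: openvino
    device: GPU

# Model section following the bundled SSDLite MobileNet v2 example in the Frigate docs;
# these files ship in the Frigate image, but paths and defaults may vary by version.
model:
  width: 300
  height: 300
  input_tensor: nhwc
  input_pixel_format: bgr
  path: /openvino-model/ssdlite_mobilenet_v2.xml
  labelmap_path: /openvino-model/coco_91cl_bkgr.txt
```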
-
openvino looks MUCH better and the inference time is extremely reasonable.