DISSECTING BUSINESS
The future of AI inference lies in striking the right balance between cost, speed and availability. From the benefits of edge versus cloud to regulatory challenges and sustainability practices, organisations must think carefully about their decisions when it comes to AI. We spoke to Robin Ferris, Enterprise Architect and AI Lead at Pulsant, about AI inference and how to create a digital infrastructure that's right for you.
OPTIMISING AI INFERENCE: COST, COMPLIANCE AND ENVIRONMENTAL IMPACT
Which do you believe offers more advantages, Edge inference or Cloud inference – and can you outline the benefits of each?
It comes down to what each of them can deliver for those working with AI models. One of the biggest distinctions is local versus remote. Edge is much closer to the model and to where the data is generated, whereas remote inference happens elsewhere, so the data has to travel to it. That means thought has to be given to latency and privacy. That's where the conversation begins: what is being ingested, how that data is being handled and what the outcome is.
Some of the models we've seen are ingesting real-time live imagery, compared with someone ingesting a data feed of information which could just be numbers. You'd almost have
www.intelligenthealth.tech 29