
Articul8 and the shift to in-house generative AI platforms

Even as the major cloud service providers such as Amazon, Microsoft, and Google continue to accelerate their data centre build-outs for generative AI services, many companies prefer to establish their own training and inference infrastructure to protect the proprietary IP that is the lifeblood of their business.

Indeed, industry figures such as Hugging Face CTO Julien Chaumond have gone as far as to predict that local machine learning will be a big trend this year as companies look for ways to drive down costs and reduce the risk of data exposure that comes with training models on third-party platforms.
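To illustrate what "local" machine learning means in practice, here is a minimal sketch using the Hugging Face transformers library: the model weights are downloaded once and then run entirely on local hardware, so prompts and outputs never leave the machine. The model name and prompt below are illustrative assumptions, not anything specified in the article or by Articul8.

```python
# Minimal sketch of local inference with Hugging Face transformers.
# After the initial weight download, generation runs on local hardware,
# keeping proprietary text off third-party platforms.
from transformers import pipeline

# "distilgpt2" is a small open model chosen purely for illustration;
# an enterprise deployment would substitute its own fine-tuned model.
generator = pipeline("text-generation", model="distilgpt2")

result = generator(
    "Summarise the key terms of the attached supplier contract:",
    max_new_tokens=50,
)
print(result[0]["generated_text"])
```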

Intel’s announcement of the formation of Articul8 AI together with the global investment firm DigitalBridge is a more concrete sign of this growing trend. By providing enterprise customers with a full-stack, vertically optimized, and secure generative artificial intelligence software platform, Articul8 will enable them to run all their data, training, and inference operations internally, with a choice of cloud, on-premises, or hybrid deployment options.

By offering a complete hardware and software platform, Articul8 addresses a key pain point facing many enterprise customers: how to roll out an in-house generative AI platform at scale that is cost-effective, secure, and easy to develop on. Perhaps the best way to think of it is as generative AI in a box.

Initially, the company is targeting large-scale enterprises in the financial services, aerospace, semiconductor, and telecommunications segments. But over the longer term it is not hard to see the Articul8 platform, and others like it, being deployed by SMEs and even smaller operations such as a doctor’s surgery or a lawyer’s office looking to unlock the value of their specialist proprietary data in the safest and most efficient manner possible.
