Facing the American giants: local, packaged AI

Their dominance in the cloud allows the large American platforms to impose themselves in artificial intelligence. For Antoine Couret, founder of Aleia, it is not too late for the European ecosystem to differentiate itself.

Antoine Couret, founder of Aleia

Can European companies find their place in the artificial intelligence market against America’s major platforms?

AI should be an opportunity for Europe to get back into the technology race. For that to happen, European companies obviously need to exist in this market. Several levers contribute to this. The product is one response to the market, but it is also a question of business model and customer targeting.


Today, SMEs and mid-size companies (ETIs) do not necessarily want to commit fully to large groups over which they have little control in the relationship. Product, business model and customers are, in my view, three distinct factors. I would add a fourth: support and proximity, with a multi-local rather than purely global approach.

At the product level, how can you differentiate from the GAFAM?

It is a complex market, roughly 75% dominated by AWS and Microsoft across cloud, data and AI. To exist, you need a product differentiator that creates appeal.

At Aleia, for example, we rely on a platform that is more integrated and packaged than what AWS delivers with SageMaker or GCP with Vertex AI. Cloud providers offer many components and let users build their own solutions as they wish. It is a bit like Leroy Merlin: a do-it-yourself approach.

In comparison, we offer more packaged services that guarantee greater ease of use. In a market whose maturity is still low, packaging reassures companies. We apply the same approach to the business model and pricing.

How does this packaging translate at the product level?

The service is fully managed, meaning it covers all the resources, including compute and storage. This is fairly standard in the cloud universe, but the advantage is that it reduces the IT resources an AI project requires. Data scientists and data analysts can free themselves from infrastructure concerns. This managed, packaged mode is ready to use, which matters for simplification.

Our second differentiator is a store that integrates datasets and pre-trained models. Users have access, for example, to pre-trained NLP models as well as French-language ontologies.

We enable collaboration at several levels: within a company's platform, between centralized data management and subsidiaries, and between companies in the same ecosystem.

Finally, we work on sovereign infrastructure through deployments with OVH and Scaleway. This is how we respond to data-control challenges.

And what about the business model?

The architecture was designed for it: we sell AI software, but we also integrate the cloud component. Unlike Databricks, for example, which separates the license from the cloud, our package includes both components at a single price. Our customers know what they are getting and what they will pay at the end of the month.

On top of the product and business model, we finally add a layer of support. We have defined a three-step approach: maturity assessment, use-case generation and implementation.

What kind of customers does such an AI offer target?

The first type of client for which we are particularly relevant is those building data spaces. Our relevance lies in the collaborative features and the underlying sovereign infrastructure.

More classically, we target businesses, SMEs and ETIs, which are reassured to be able to call on a local player that understands their concerns around trust, control and performance.

At the end of the year, we will launch an offer aimed at a third segment: the community of data scientists and freelance AI developers.

And what about big companies?

We are able to serve these companies, which challenge us on speed and performance criteria for their business use cases. For example, we have agreements with Renault and Disney.

We work with large enterprise customers on Amazon or Azure, and we have no problem deploying to these cloud platforms; we are even complementary to them. Our solutions are packaged on Kubernetes and can be deployed in any cloud.
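Packaging on Kubernetes is what makes this portability possible: the same manifest can be applied unchanged to a managed cluster on AWS, Azure, GCP, OVH or Scaleway. A minimal sketch of the idea follows; the image name, labels and ports are hypothetical illustrations, not Aleia's actual deployment artifacts.

```yaml
# Hypothetical manifest illustrating cloud-agnostic packaging.
# Names, image and ports are illustrative only.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ai-platform
spec:
  replicas: 2
  selector:
    matchLabels:
      app: ai-platform
  template:
    metadata:
      labels:
        app: ai-platform
    spec:
      containers:
        - name: platform
          image: registry.example.com/ai-platform:1.0  # hypothetical image
          ports:
            - containerPort: 8080
---
apiVersion: v1
kind: Service
metadata:
  name: ai-platform
spec:
  selector:
    app: ai-platform
  ports:
    - port: 80
      targetPort: 8080
```

Applied with `kubectl apply -f platform.yaml`, this works identically on EKS, AKS, GKE or a sovereign provider's managed Kubernetes, which is the complementarity described above.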

Does the market today allow challengers to develop alongside cloud giants?

Certainly. The rate of AI industrialization is estimated to fluctuate between 5 and 15%. It should triple within five years and reach 70 to 80% by 2030. We are in a growing market that is also technically more complex than the cloud.

In AI, there is more room to maneuver, with issues around data privacy, ethics and datasets. Vendors can differentiate themselves in many ways, including AI performance and model training.

Do large companies and states have a role to play in shaping the European market?

We are lucky to have large groups in Europe. They have a role to play in this new AI market, rather than making the safe choice of AWS, as they would have chosen IBM in another era. Similarly, public authorities, through public procurement, must assume their responsibilities by playing local. Alternatives exist, even if the ecosystem still has to work on its visibility, particularly among large groups.
