“This is a cutting-edge platform that will change over several revisions, as we keep meeting challenges and finding new solutions”

Published on 18/02/2022

 

Interview with Alberto Gutiérrez and Josep Lluís Berral, INCISIVE’s partners from the Barcelona Supercomputing Center-Centro Nacional de Supercomputación (BSC-CNS), who are leading the ‘Platform DevOps Setup work and Data Sources Integration’ work package (WP3).

What is the role of your work package in the project?

WP3 plays the key role of providing the implementation, deployment and management mechanisms for the INCISIVE platform infrastructure, which was designed in WP2 and is coordinated with WP4 and WP5 for the AI applications and the Federated Storage. This infrastructure provides an execution framework for the WP4 AI applications and a deployment environment for the WP5 Federated Storage. Security and access management for data and tools is also designed and integrated as part of this work package. Finally, as mentioned before, the infrastructure effort is completed with the modules that orchestrate and optimize workloads.

Beyond the infrastructure, this work package also deals with the types, standardization and format of the data, managing and integrating all of it into a Common Data Format to enable interoperability across the data provided by the partners. In this way, annotated medical data, and the applications created to annotate such data, can be used from one partner to another without adaptation. This is a mandatory requirement for the researchers developing AI tools and medical analyses, and for the users of the medical data.
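As a loose illustration of this idea only (the actual INCISIVE Common Data Format, its fields and its structure are not described in this article, so everything below is a hypothetical stand-in), different partner-specific record layouts can be mapped onto one shared structure so that the same annotation tools and AI pipelines can consume data from any partner:

```python
from dataclasses import dataclass

# Hypothetical common record; the real INCISIVE Common Data Format is not shown here.
@dataclass
class CommonImagingRecord:
    patient_id: str
    modality: str      # e.g. "MRI", "CT"
    cancer_type: str   # e.g. "breast", "lung"
    annotation: str    # label produced by the annotation tools

def from_partner_a(row: dict) -> CommonImagingRecord:
    # Partner A (invented) exports lowercase keys with its own naming.
    return CommonImagingRecord(
        patient_id=row["pid"],
        modality=row["scan_type"].upper(),
        cancer_type=row["diagnosis"],
        annotation=row["label"],
    )

def from_partner_b(row: dict) -> CommonImagingRecord:
    # Partner B (also invented) uses a different layout.
    return CommonImagingRecord(
        patient_id=row["PatientID"],
        modality=row["Modality"],
        cancer_type=row["CancerSite"].lower(),
        annotation=row["Annotation"],
    )

# Once both partners' data are expressed as CommonImagingRecord,
# downstream tools can consume either source without adaptation.
```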

“Knowledge between medical partners can be shared safely without critical data leaving their tenant premises and domains”

 

Which results do you expect?

The results of this work package will be the implementation and deployment of the infrastructure for the INCISIVE platform, repository and AI toolbox, together with the standards for data sharing and annotation. Through the selected designs and technologies, the platform will be able to perform Federated Learning and Inference in environments where data movement is restricted but knowledge (the models) can be shared. The federated data-modelling schema provides the conceptual security that knowledge can be shared safely between medical partners without critical data leaving their tenant premises and domains. It will also allow AI researchers to improve prediction and forecasting models from federated data without the risk of that data leaving its domain. Pushing forward a common data model also eases the future integration of new data partners and donors, by defining a standard procedure for transforming the data to be used.
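As a minimal, purely illustrative sketch of this federated principle (model parameters move, data does not), the snippet below averages locally trained weights in the style of federated averaging. It is not the INCISIVE implementation; the nodes, toy data and weighting scheme are invented for the example.

```python
import numpy as np

def local_update(global_weights: np.ndarray, local_data: np.ndarray,
                 local_labels: np.ndarray, lr: float = 0.01) -> np.ndarray:
    """One local gradient step of a linear model; only the weights leave the node."""
    predictions = local_data @ global_weights
    gradient = local_data.T @ (predictions - local_labels) / len(local_labels)
    return global_weights - lr * gradient

def federated_round(global_weights, nodes):
    """Each node trains on its own private data; the server only averages the returned weights."""
    updates = [local_update(global_weights, X, y) for X, y in nodes]
    sizes = np.array([len(y) for _, y in nodes], dtype=float)
    # Weighted average: larger local datasets contribute more to the shared model.
    return np.average(updates, axis=0, weights=sizes)

# Invented toy data standing in for two data partners' private datasets.
rng = np.random.default_rng(0)
nodes = [(rng.normal(size=(50, 3)), rng.normal(size=50)),
         (rng.normal(size=(80, 3)), rng.normal(size=80))]

weights = np.zeros(3)
for _ in range(10):
    weights = federated_round(weights, nodes)
print(weights)  # the shared "knowledge"; the raw data never left its node
```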

In October 2021, you started working on the Resource Management and Orchestration task. What is your main focus?

This task, internally named ‘Task 3.4’, provides the design of the workload pipelines and the technologies allowing the federation of processes on which the AI toolbox is built. It indicates how AI models are trained and used in a multi-user environment, instead of in isolated and disconnected groups or datasets. Further, it provides the policies and methods to manage the available resources so as to ensure the correct performance of the AI workloads.

The task focuses on resource and application orchestration: how to assign and provision the resources the AI applications need to execute, decide their proper placement, and optimize those decisions to reduce the resources used while meeting application and user requirements. The main component, the orchestrator, is the underlying layer that operates the execution of such AI workloads in the form of “containers”. To avoid having thousands of different AI applications with different input/output interfaces in the INCISIVE AI Toolbox, we set a standard (a container) that acts as the envelope into which AI developers insert their AI tools, adjusting their inputs and outputs to an agreed common interface (see the sketch below). That way, users simply obtain the container with their desired AI application and interact with it through the standard interface. The orchestrator is the element that deploys those containers according to user and platform demands. In addition, as part of the INCISIVE platform design, we are adding policies and procedures to the orchestrator to enable the coordination between partners required for Federated Learning, where all interested partners train a model using the same container application and then share or merge the results. This involves managing Cloud and Federated Node resources.
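As a hedged sketch of what such a common container interface could look like (the actual INCISIVE contract, its paths and its arguments are not specified in this article, so the names below are assumptions), every AI tool could expose the same entrypoint and fixed input/output locations, so the orchestrator can run any container in the same way:

```python
import argparse
import json
from pathlib import Path

# Hypothetical fixed locations; the real INCISIVE container contract is not described here.
INPUT_DIR = Path("/input")     # the orchestrator mounts the (local) data here
OUTPUT_DIR = Path("/output")   # the tool writes models or predictions here

def train(input_dir: Path, output_dir: Path) -> None:
    # The AI developer's training code goes here; it only sees the mounted data.
    (output_dir / "model.json").write_text(json.dumps({"weights": [0.0, 0.0]}))

def infer(input_dir: Path, output_dir: Path) -> None:
    # Inference reads a previously produced model and writes predictions.
    (output_dir / "predictions.json").write_text(json.dumps({"predictions": []}))

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Common entrypoint every AI container exposes")
    parser.add_argument("mode", choices=["train", "infer"])
    args = parser.parse_args()
    OUTPUT_DIR.mkdir(parents=True, exist_ok=True)
    (train if args.mode == "train" else infer)(INPUT_DIR, OUTPUT_DIR)
```

Under this kind of contract, the orchestrator only needs to know one convention: mount the data, choose a mode, and collect the outputs, regardless of which AI application is inside the container.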

“The integration is bound to happen by the end of March 2022, a very promising and exciting moment in which we will see everything working together for the first time!”

 

Finally, what are your next steps?

At this time, we are building the first prototype of the workload orchestration and pipeline, to test the fundamental functions and operations and to serve as a proof of the federated paradigm. The next step is to complete the prototype by adding support for the different scenarios of a federated approach to AI, and then integrate it with the rest of the INCISIVE platform and repositories. The integration is bound to happen by the end of March 2022, a very promising and exciting moment in which we will see everything working together for the first time!

From this integration on, the requirements of the orchestrator and the AI management will be iterated and refined towards the next prototype, as this is a cutting-edge platform that will change over several revisions while we keep meeting challenges and finding new solutions. At the end of the day, all of these efforts will contribute the know-how obtained in this project to other similar projects, and will also help researchers looking to perform this kind of analysis on medical data by using the INCISIVE platform.

 

Update June 2022: The first prototype has already been launched. You can read the news article on this link. 
