
🤖 AI Framework Support

The iExec Platform provides comprehensive support for popular AI and machine learning frameworks, making it easy to deploy confidential AI. Each supported framework runs inside a secure Trusted Execution Environment (TEE) with an optimized configuration.

🚀 Quick Start

Want to get started immediately? Jump straight to the Docker examples in Getting Started with Docker Examples below.

🛡️ Why iExec for AI?

Confidential Computing

Trusted Execution Environments (TEEs) protect your AI models and data end-to-end:

  • Data Privacy: TEEs isolate AI computations in secure enclaves
  • Secure Training & Inference: Unauthorized entities can never access your models and data
  • Hardware-Level Security: Intel SGX and TDX provide enterprise-grade protection

AI Monetization

Monetize your AI assets easily and securely:

  • Datasets: Encrypt and sell access to your training data
  • Models: Deploy and monetize your trained AI models
  • Agents: Create and sell AI agents and applications
  • Ownership Preserved: Your digital assets always remain yours

Decentralized Infrastructure

Scale AI applications without centralized cloud dependencies:

  • On-Demand Compute: Access powerful resources when you need them
  • Fair Pricing: Blockchain verifies execution costs transparently
  • Global Network: Deploy across a worldwide network of secure workers

🤖 AI Framework Support

Overview

| Framework | TDX Support | SGX Support | Best For |
|---|---|---|---|
| TensorFlow | ✅ Yes (3.01GB) | ❌ No | Deep learning, production ML |
| PyTorch | ✅ Yes (6.44GB) | ❌ No | Research, computer vision |
| Scikit-learn | ✅ Yes (1.18GB) | ✅ Yes (1.01GB) | Traditional ML, data analysis |
| OpenVINO | ✅ Yes (1.82GB) | ❌ No | Computer vision, inference |
| NumPy | ✅ Yes (1.25GB) | ✅ Yes (1.08GB) | Scientific computing |
| Matplotlib | ✅ Yes (1.25GB) | ✅ Yes (1.08GB) | Data visualization |

Framework Details

| Framework | Version | Description | TDX Support | SGX Support | Use Cases | Resources |
|---|---|---|---|---|---|---|
| TensorFlow | 2.19.0 | Google's ML framework for production AI | ✅ 3.01GB | ❌ Too large | Deep learning, CV, NLP | Docs, Quickstart, Docker |
| PyTorch | 2.7.0+cu126 | Facebook's research-focused DL framework | ✅ 6.44GB | ❌ Too large | Research, DL, CV, NLP | Docs, Quickstart, Docker |
| Scikit-learn | 1.6.1 | Comprehensive ML library for Python | ✅ 1.18GB | ✅ 1.01GB | Classification, regression, clustering | Docs, Examples, Docker |
| OpenVINO | 2024.6.0 | Intel's high-performance AI inference toolkit | ✅ 1.82GB | ❌ Execution issues | Computer vision, inference | Docs, Tutorial, Docker |
| NumPy | 2.0.2 | Fundamental package for scientific computing | ✅ 1.25GB | ✅ 1.08GB | Scientific computing, data analysis | Docs, User Guide, Docker |
| Matplotlib | 3.9.4 | Comprehensive library for data visualization | ✅ 1.25GB | ✅ 1.08GB | Data visualization, plotting | Docs, Gallery, Docker |
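
If some of these frameworks are already installed in a local Python environment, a quick check such as the sketch below (it simply reports anything that is not installed) confirms whether your local versions match the versions tested above:

```bash
# Print the locally installed versions of the frameworks listed above;
# packages that are not installed are reported and skipped.
for pkg in tensorflow torch scikit-learn openvino numpy matplotlib; do
  python -c "from importlib.metadata import version; print('$pkg', version('$pkg'))" \
    2>/dev/null || echo "$pkg: not installed"
done
```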

🐳 Getting Started with Docker Examples

What's Included

Our AI Frameworks Hello World repository includes ready-to-use examples:

```
ai-frameworks-hello-world/
├── tensorflow/     # TensorFlow 2.19.0 example
├── pytorch/        # PyTorch 2.7.0+cu126 example
├── scikit/         # Scikit-learn 1.6.1 example
├── openvino/       # OpenVINO 2024.6.0 example
├── numpy/          # NumPy 2.0.2 example
└── matplotlib/     # Matplotlib 3.9.4 example
```

Quick Start Commands

```bash
# Clone the repository
git clone https://github.com/iExecBlockchainComputing/ai-frameworks-hello-world.git
cd ai-frameworks-hello-world

# Try the TensorFlow example
cd tensorflow
docker build -t hello-tensorflow .
docker run --rm hello-tensorflow

# Try the PyTorch example
cd ../pytorch
docker build -t hello-pytorch .
docker run --rm hello-pytorch
```
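
The remaining examples (scikit, openvino, numpy, matplotlib) follow the same build-and-run pattern. A loop like the sketch below, run from the repository root, exercises all six containers in sequence (the hello-<framework> image tags are illustrative):

```bash
# Build and run every framework example in sequence.
# Run from the ai-frameworks-hello-world repository root;
# the hello-<framework> image tags are illustrative.
for fw in tensorflow pytorch scikit openvino numpy matplotlib; do
  docker build -t "hello-$fw" "./$fw"
  docker run --rm "hello-$fw"
done
```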

Features

  • ✅ Isolated Testing: Each framework runs in its own container
  • ✅ Reproducible: Consistent environment across systems
  • ✅ TDX Ready: All containers tested for Intel TDX compatibility
  • ✅ Easy Deployment: Simple build and run commands

📊 Technology Comparison

TDX vs SGX for AI

| Feature | Intel TDX | Intel SGX |
|---|---|---|
| Memory Limit | Multi-GB+ | ~1.95GB |
| Framework Support | All major frameworks | Limited (Scikit-learn, NumPy) |
| Code Changes | Minimal ("lift and shift") | Significant modifications required |
| Production Ready | ✅ Yes | ⚠️ Limited |
| AI Workloads | ✅ Excellent | ❌ Restricted |
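
The ~1.95GB SGX memory limit is the main constraint to keep in mind when sizing your own images. As a rough first check (image size is only a proxy for in-enclave memory usage, not an exact measure), you can list the sizes of the hello-world images built above:

```bash
# List the sizes of the example images built earlier and compare them
# against the ~1.95GB SGX limit (image size is a rough proxy only).
docker image ls --format '{{.Repository}}: {{.Size}}' | grep '^hello-'
```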

Recommendations

For Production AI Applications

  • Use TDX for TensorFlow, PyTorch, and OpenVINO
  • Use SGX for lightweight ML with Scikit-learn and NumPy

For Development and Testing

  • Start with SGX for simple ML tasks
  • Migrate to TDX for complex AI workloads

Important Considerations

  • SGX Limitations: Expect potential library incompatibilities and code modifications
  • TDX Advantages: Minimal code changes required ("lift and shift" approach)

📚 Next Steps

  • Learn TEE Technologies
  • Build AI Applications
  • Explore Examples