# ComPDF AI Configuration Requirements
## Basic Configuration Requirements
Running ComPDF AI requires the following minimum hardware specifications:
| Resource Type | Minimum Requirement |
|---|---|
| GPU Memory | ≥ 24 GB |
| CPU | ≥ 8 Cores |
| RAM | ≥ 16 GB |
| Storage | No specific requirements |
> **Note:** GPU memory is a critical resource for running ComPDF AI. Ensure your GPU memory meets the 24 GB minimum to guarantee proper model loading and inference.
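As a quick preflight check, the GPU-memory minimum above can be verified on the host. The sketch below is not part of ComPDF AI; it simply parses the output of a standard `nvidia-smi` memory query (one MiB value per GPU) and compares it against the 24 GB requirement.

```python
# Minimal preflight sketch: check reported GPU memory against the
# 24 GB minimum stated in the requirements table above.

MIN_GPU_MEM_BYTES = 24 * 10**9  # >= 24 GB GPU memory (decimal GB)


def parse_gpu_memory_mib(nvidia_smi_output: str) -> list[int]:
    """Parse the output of:
        nvidia-smi --query-gpu=memory.total --format=csv,noheader,nounits
    which prints one line per GPU, each line a total-memory value in MiB.
    """
    return [int(line.strip()) for line in nvidia_smi_output.splitlines() if line.strip()]


def gpu_meets_minimum(nvidia_smi_output: str) -> bool:
    """True if at least one GPU reports >= 24 GB of total memory."""
    mib_to_bytes = 1024 * 1024
    return any(
        mib * mib_to_bytes >= MIN_GPU_MEM_BYTES
        for mib in parse_gpu_memory_mib(nvidia_smi_output)
    )
```

On the instance itself, feed the function the output of `nvidia-smi --query-gpu=memory.total --format=csv,noheader,nounits`. Note that a 24 GB card such as the NVIDIA L4 typically reports a figure like `23034` MiB, which is about 24.15 GB and therefore passes the check.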
## Recommended Instance Type
For deploying and running ComPDF AI on AWS EC2, we recommend the g6.xlarge instance type.
### g6.xlarge Configuration Overview
| Specification | Configuration |
|---|---|
| GPU | 1 x NVIDIA L4 |
| GPU Memory | 24 GB |
| vCPU | 4 |
| RAM | 16 GB |
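For reference, a launch request for this instance type might be assembled as below. This is a sketch, not an official ComPDF AI deployment script: the AMI ID, key pair, and security group values are placeholders you must supply, and the returned dict mirrors the keyword arguments accepted by boto3's `EC2.Client.run_instances` (i.e. `ec2.run_instances(**params)`).

```python
def build_launch_params(ami_id: str, key_name: str, security_group_id: str) -> dict:
    """Assemble run_instances parameters for a single g6.xlarge instance.

    All three arguments are placeholders specific to your AWS account;
    pass the result to boto3 as:  ec2.run_instances(**params)
    """
    return {
        "ImageId": ami_id,            # e.g. a Deep Learning AMI with NVIDIA drivers preinstalled
        "InstanceType": "g6.xlarge",  # 1x NVIDIA L4 (24 GB), 4 vCPU, 16 GB RAM
        "MinCount": 1,
        "MaxCount": 1,
        "KeyName": key_name,
        "SecurityGroupIds": [security_group_id],
        "BlockDeviceMappings": [
            {
                # Root volume size is an assumption: ComPDF AI states no
                # specific storage requirement, so size this to your workload.
                # The device name depends on the chosen AMI.
                "DeviceName": "/dev/xvda",
                "Ebs": {"VolumeSize": 100, "VolumeType": "gp3"},
            }
        ],
    }
```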
### Why Choose g6.xlarge?
G6 instances are powered by the NVIDIA L4 Tensor Core GPU, a data-center GPU designed for AI inference and graphics-intensive workloads. The NVIDIA L4 offers the following advantages:
- 24 GB GDDR6 Memory: Fully meets the memory requirements of ComPDF AI, enabling smooth execution of intelligent document processing tasks.
- High Energy Efficiency: With a power consumption of only 72W, the L4 GPU maintains high performance while effectively controlling operational costs.
- AI Inference Optimization: Supports multiple precision formats, including FP8 and INT8, and is optimized for AI inference workloads.
With its strong price-performance ratio and stable performance, the g6.xlarge instance is well suited as the runtime environment for ComPDF AI.