Service Exposure Guide
Expose network ports from your GPU instances to the internet for APIs, web UIs, and other services.
Overview
Service Exposure allows you to expose network ports from your GPU instances to the internet, making services accessible via public URLs. This eliminates the need for SSH tunneling.
When you expose a service:
- Your internal port is mapped to an external port
- An external IP address and port are assigned
- Traffic from the internet is routed to your pod
- You receive a public URL to access the service
Common Use Cases
vLLM API Server
Expose your vLLM OpenAI-compatible API:
- Port: 8000
- Protocol: TCP
```bash
python -m vllm.entrypoints.openai.api_server --host 0.0.0.0 --port 8000
```
Jupyter Notebook
Access Jupyter Lab or Notebook remotely:
- Port: 8888
- Protocol: TCP
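For example, a typical way to start JupyterLab so it is reachable from outside the instance (Jupyter's token authentication remains on by default):
```bash
# Start JupyterLab listening on all interfaces, without opening a local browser
jupyter lab --ip 0.0.0.0 --port 8888 --no-browser
```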
TensorBoard
Monitor training metrics in real-time:
- Port: 6006
- Protocol: TCP
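For example, to serve TensorBoard on an exposable port (the `runs` log directory below is a placeholder; point it at your own logs):
```bash
# Serve TensorBoard on all interfaces, reading logs from ./runs (placeholder path)
tensorboard --logdir runs --host 0.0.0.0 --port 6006
```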
Gradio / Streamlit
Expose custom web applications:
- Gradio Port: 7860
- Streamlit Port: 8501
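For example, both frameworks can be told to bind to all interfaces; the commands below assume your entry point is `app.py`, and for Gradio the bind address can also be set in code via `demo.launch(server_name="0.0.0.0", server_port=7860)`:
```bash
# Streamlit: bind to all interfaces on its default port
streamlit run app.py --server.address 0.0.0.0 --server.port 8501

# Gradio: these environment variables set the default bind address and port
GRADIO_SERVER_NAME=0.0.0.0 GRADIO_SERVER_PORT=7860 python app.py
```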
How to Expose a Service
Step 1: Start Your Application
Ensure your service is running and listening on the desired port. Important: Always bind to 0.0.0.0 (all interfaces), not localhost or 127.0.0.1.
```bash
# Example: start the vLLM OpenAI-compatible API server, bound to all interfaces
python -m vllm.entrypoints.openai.api_server \
    --model your-model \
    --host 0.0.0.0 \
    --port 8000
```
Step 2: Click "Expose Port"
In the Exposed Services section of your GPU card, click the "+ Expose Port" button.
Step 3: Fill in Service Details
| Field | Description | Example |
|---|---|---|
| Service Name | A friendly name (lowercase, hyphens allowed) | vllm-api |
| Port | Port your app is listening on | 8000 |
| Protocol | TCP (default) or UDP | TCP |
Step 4: Access Your Service
Once exposed, you'll see the external URL:
```
http://34.123.45.67:31234
```
Click the copy button to copy the URL.
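As a quick sanity check, assuming you exposed the vLLM server from Step 1, you can query it from your own machine (the address below is a placeholder; substitute your external URL):
```bash
# List the models served by the vLLM OpenAI-compatible API (placeholder address)
curl http://34.123.45.67:31234/v1/models
```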
Managing Services
Viewing Exposed Services
All exposed services are shown in the "Exposed Services" section of your GPU card with their external URLs.
Deleting a Service
- Find the service in the Exposed Services section
- Click the delete (trash) icon
- Confirm the deletion
Note: External clients will lose access immediately.
Best Practices
Security
- Add authentication to sensitive services (see the example after this list)
- Use HTTPS when possible for production
- Limit exposed ports - only expose what you need
- Delete services when no longer needed
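As a lightweight example of the first point, the vLLM server used earlier accepts an `--api-key` flag, after which clients must present the key as a bearer token (the key value below is a placeholder; generate your own):
```bash
# Require a bearer token on the exposed vLLM API (placeholder key)
python -m vllm.entrypoints.openai.api_server \
    --model your-model \
    --host 0.0.0.0 \
    --port 8000 \
    --api-key change-me-to-a-long-random-string
```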
Naming
- Use descriptive names: vllm-api, jupyter-main
- Lowercase only, use hyphens not spaces
Troubleshooting
Service Not Accessible
Application not listening on 0.0.0.0:
```bash
# Wrong - only localhost can connect
python app.py --host 127.0.0.1 --port 8000

# Correct - external traffic can connect
python app.py --host 0.0.0.0 --port 8000
```
Application not running:
```bash
ps aux | grep python
```
Wrong port number:
```bash
netstat -tlnp | grep 8000
```
Connection Timeout
- Verify your GPU instance is running
- Check the service exists in the Exposed Services section
- Ensure your application is running and listening
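A quick way to separate application problems from routing problems is to query the service locally from inside the GPU instance first (assuming port 8000, as in the examples above); if this fails, the issue is the application itself rather than the exposed port:
```bash
# From inside the GPU instance: check that the app answers locally (assumed port 8000)
curl -v http://127.0.0.1:8000/
```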
Alternative: SSH Port Forwarding
For temporary access without exposing publicly, you can use SSH port forwarding instead:
```bash
# Forward remote port 8000 to local port 8000
ssh -p <ssh-port> -L 8000:localhost:8000 ubuntu@<host>
```
Then access at http://localhost:8000 on your machine.
Need Help?
Contact us at support@packet.ai
