Model
Google: Gemma 3n 4B (free)
Catalog snapshot from OpenRouter. This model is discoverable on-site even when it is not currently included in the global ranking list.
Data version: v20260430T083453Z · Data size: 100
About this model
Google: Gemma 3n 4B (free) is listed in our model catalog as an LLM with an 8,192-token context window and a snapshot average price of around $0.00 per 1M tokens. The data below is generated from the latest catalog snapshot, and integration examples are provided when an OpenRouter id is available.
You can also explore more models from Google, and browse more options from 🇺🇸 United States.
Key metrics
- Rank: Not ranked
- Kind: LLM
- Core metric: 8,192 ctx
- Average price: $0.00 per 1M tokens
- Vendor / team: Google
- Origin: 🇺🇸 United States
- License: Proprietary
- VRAM requirement: API-managed
Hippo's Quick Action
OpenRouter chat completions URL; set Authorization and body per docs.
Price comparison (snapshot)
| Source / aggregator | Price / 1M tokens | Latency |
|---|---|---|
| Snapshot average (board) | $0.00 | — |
Figures come from the imported leaderboard snapshot; live aggregator pricing and latency can change.
Token pricing by provider
Compare per-provider token prices for this model across available platforms.
| Provider | Input / 1M tokens | Output / 1M tokens | Status | Price updated |
|---|---|---|---|---|
| OpenRouter | $0.00 | $0.00 | Verified | 2026-04-30T08:36:20.578Z |
| OpenAI | $0.16 | $0.97 | Snapshot | 2026-04-29T05:13:01.370Z |
| Azure OpenAI | $0.16 | $0.97 | Snapshot | 2026-04-29T05:13:01.370Z |
| Groq | $0.16 | $0.97 | Snapshot | 2026-04-29T05:13:01.370Z |
| Fireworks | $0.16 | $0.97 | Snapshot | 2026-04-29T05:13:01.370Z |
| Together | $0.16 | $0.97 | Snapshot | 2026-04-29T05:13:01.370Z |
Provider prices are sourced from the token comparison dataset and may change between snapshots.
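To see what per-1M-token prices mean for a single request, here is a minimal cost-estimate sketch. The $0.16 input / $0.97 output rates are the snapshot figures from the table above, and the token counts are made-up example values:

```python
def request_cost(input_tokens: int, output_tokens: int,
                 input_price_per_m: float, output_price_per_m: float) -> float:
    """Dollar cost of one request given per-1M-token input/output prices."""
    return (input_tokens * input_price_per_m
            + output_tokens * output_price_per_m) / 1_000_000

# Example: 2,000 prompt tokens and 500 completion tokens
# at the $0.16 / $0.97 snapshot rates.
cost = request_cost(2_000, 500, 0.16, 0.97)
print(f"${cost:.6f}")  # $0.000805
```

On the OpenRouter route the snapshot price is $0.00, so the same call with zero rates returns zero.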
How to integrate
OpenRouter exposes an OpenAI-compatible Chat Completions endpoint. Examples in several languages follow; replace the model id with the one from your provider page if you route elsewhere.
```javascript
// Node.js 18+ — set OPENROUTER_API_KEY in your environment
const res = await fetch('https://openrouter.ai/api/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${process.env.OPENROUTER_API_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    model: 'google/gemma-3n-e4b-it:free',
    messages: [{ role: 'user', content: 'Hello' }],
  }),
});
const data = await res.json();
console.log(data);
```

```python
# pip install requests
import json
import os

import requests

payload = {
    "model": "google/gemma-3n-e4b-it:free",
    "messages": [{"role": "user", "content": "Hello"}],
}
resp = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={
        "Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}",
        "Content-Type": "application/json",
    },
    data=json.dumps(payload),
)
resp.raise_for_status()
print(resp.json())
```

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
	"os"
)

func main() {
	key := os.Getenv("OPENROUTER_API_KEY")
	payload := map[string]any{
		"model":    "google/gemma-3n-e4b-it:free",
		"messages": []map[string]string{{"role": "user", "content": "Hello"}},
	}
	b, err := json.Marshal(payload)
	if err != nil {
		panic(err)
	}
	req, err := http.NewRequest(http.MethodPost, "https://openrouter.ai/api/v1/chat/completions", bytes.NewReader(b))
	if err != nil {
		panic(err)
	}
	req.Header.Set("Authorization", "Bearer "+key)
	req.Header.Set("Content-Type", "application/json")
	res, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer res.Body.Close()
	out, err := io.ReadAll(res.Body)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))
}
```

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OpenRouterChat {
    public static void main(String[] args) throws Exception {
        String key = System.getenv("OPENROUTER_API_KEY");
        String body = "{\"model\":\"google/gemma-3n-e4b-it:free\",\"messages\":[{\"role\":\"user\",\"content\":\"Hello\"}]}";
        HttpRequest req = HttpRequest.newBuilder()
            .uri(URI.create("https://openrouter.ai/api/v1/chat/completions"))
            .header("Authorization", "Bearer " + key)
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();
        HttpResponse<String> res = HttpClient.newHttpClient().send(req, HttpResponse.BodyHandlers.ofString());
        System.out.println(res.body());
    }
}
```

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    val key = System.getenv("OPENROUTER_API_KEY") ?: error("OPENROUTER_API_KEY is not set")
    val body = "{\"model\":\"google/gemma-3n-e4b-it:free\",\"messages\":[{\"role\":\"user\",\"content\":\"Hello\"}]}"
    val req = HttpRequest.newBuilder()
        .uri(URI.create("https://openrouter.ai/api/v1/chat/completions"))
        .header("Authorization", "Bearer $key")
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()
    val res = HttpClient.newHttpClient().send(req, HttpResponse.BodyHandlers.ofString())
    println(res.body())
}
```

```rust
// Cargo.toml: serde_json = "1", ureq = "2"
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let key = std::env::var("OPENROUTER_API_KEY")?;
    let model = "google/gemma-3n-e4b-it:free";
    let body = json!({
        "model": model,
        "messages": [{"role": "user", "content": "Hello"}]
    });
    let resp = ureq::post("https://openrouter.ai/api/v1/chat/completions")
        .set("Authorization", &format!("Bearer {}", key))
        .set("Content-Type", "application/json")
        .send_string(&body.to_string())?;
    println!("{}", resp.into_string()?);
    Ok(())
}
```

```scala
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

@main def run(): Unit =
  val key = sys.env.getOrElse("OPENROUTER_API_KEY", throw new RuntimeException("OPENROUTER_API_KEY is not set"))
  val body = "{\"model\":\"google/gemma-3n-e4b-it:free\",\"messages\":[{\"role\":\"user\",\"content\":\"Hello\"}]}"
  val req = HttpRequest.newBuilder()
    .uri(URI.create("https://openrouter.ai/api/v1/chat/completions"))
    .header("Authorization", s"Bearer $key")
    .header("Content-Type", "application/json")
    .POST(HttpRequest.BodyPublishers.ofString(body))
    .build()
  val res = HttpClient.newHttpClient().send(req, HttpResponse.BodyHandlers.ofString())
  println(res.body())
```

```shell
# macOS/Linux — set OPENROUTER_API_KEY in your environment
curl -sS 'https://openrouter.ai/api/v1/chat/completions' \
  -H "Authorization: Bearer $OPENROUTER_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model":"google/gemma-3n-e4b-it:free","messages":[{"role":"user","content":"Hello"}]}'
```

Store API keys in environment variables or a secret manager; never commit them to source control.
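The examples above print the raw JSON response. In the OpenAI-compatible schema, the reply text lives at `choices[0].message.content` and token counts under `usage`. A minimal extraction sketch; the `response` dict below is a hand-written example payload, not live API output:

```python
# `response` is a hypothetical OpenAI-compatible chat completion payload.
response = {
    "id": "gen-example",
    "choices": [
        {"index": 0,
         "message": {"role": "assistant", "content": "Hello! How can I help?"}}
    ],
    "usage": {"prompt_tokens": 5, "completion_tokens": 9, "total_tokens": 14},
}

# Pull out the assistant reply and the token usage.
reply = response["choices"][0]["message"]["content"]
usage = response.get("usage", {})
print(reply)                      # Hello! How can I help?
print(usage.get("total_tokens"))  # 14
```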
Alternative picks
- DeepSeek: DeepSeek V4 Flash (Compare now)
- Writer: Palmyra X5 (Compare now)
- inclusionAI: Ling-2.6-1T (free) (Compare now)
Pick one or two more models from the global rankings and use Compare to view them side by side.