Gemini API Proxy

A secure and reliable gateway for your applications to interact with the Gemini AI.

Purpose
This service acts as a secure intermediary, allowing your client applications (e.g., Flutter) to access Google's Gemini API without exposing sensitive API keys on the client side.

By routing requests through this proxy, you enhance security and maintain control over API usage.

API Endpoint
Interact with the Gemini API via the following endpoint:
/api/gemini-proxy

Method: POST

Body (JSON):

{
  "prompt": "Your query for Gemini AI..."
}

The Gemini model is selected via the GENKIT_DEFAULT_MODEL environment variable; if it is not set, the application falls back to a hardcoded model (see Model Selection below).

Security
The proxy applies several layers of protection:
  • API Key Protection: Your actual Gemini API key is stored securely on the server and never exposed to the client.
  • Proxy Authentication: Requests to this proxy must include an Authorization header: Bearer YOUR_PROXY_API_KEY.
  • CORS Policy: Implemented to ensure requests are accepted from authorized origins. During development (NODE_ENV=development), any localhost origin is allowed. In other environments, only requests from the APP_DOMAIN are permitted.
  • Rate Limiting: Basic IP-based rate limiting is active (default: 20 requests/IP/minute) to help prevent abuse. This is an in-memory limit per server instance.
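The CORS behavior described above can be sketched as a small origin check. This is an illustrative helper, not the proxy's actual code; the function name and parameters are assumptions, and the real logic lives in the API route.

```typescript
// Sketch of the CORS origin check: any localhost origin in development,
// only the configured APP_DOMAIN elsewhere.
function isOriginAllowed(
  origin: string | null,
  nodeEnv: string,
  appDomain: string | undefined
): boolean {
  if (!origin) return false;
  if (nodeEnv === "development") {
    // In development, any localhost origin is accepted.
    try {
      const { hostname } = new URL(origin);
      if (hostname === "localhost" || hostname === "127.0.0.1") return true;
    } catch {
      return false; // malformed Origin header
    }
  }
  // Otherwise, only an exact match with APP_DOMAIN is permitted.
  return appDomain !== undefined && origin === appDomain;
}
```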

Setup & Deployment Guide
Follow these steps to configure, deploy, and use your Gemini API Proxy service.

1. Configure Environment Variables

Create a .env file in the root of your project for local development. For deployment, set these variables in your hosting environment's settings.

  • PROXY_API_KEY

    A secret key your client applications will use to authenticate with this proxy service. Generate a strong, unique key.

    PROXY_API_KEY=your_strong_unique_proxy_secret_key
  • GOOGLE_API_KEY (or configure Application Default Credentials)

    Your API key for accessing Google's Gemini API. Obtain it from Google AI Studio or Google Cloud Console. Alternatively, if your server environment supports it (e.g., Google Cloud services), you can use Application Default Credentials (ADC), and Genkit will pick them up automatically.

    GOOGLE_API_KEY=your_google_ai_gemini_api_key

    Important: This key grants access to Google AI services. Keep it secret and secure on the server. Never expose it in client-side code.

  • APP_DOMAIN (Required for Production CORS)

    The full origin (e.g., https://your-client-app.com) of the client application that will call this proxy. In production, the CORS policy only accepts requests from this origin, so it must be set correctly. During local development (NODE_ENV=development) any localhost origin is allowed automatically, so this variable is not strictly required there, but setting it anyway keeps configurations consistent.

    APP_DOMAIN=https://myfrontendapp.com
  • GENKIT_DEFAULT_MODEL (Recommended)

    Specifies the default Gemini model to be used by Genkit. If this variable is not set, the application falls back to a hardcoded model (currently googleai/gemini-2.0-flash). Set this to your preferred model, like googleai/gemini-pro.

    GENKIT_DEFAULT_MODEL=googleai/gemini-pro

Model Selection (Order of Precedence):

  1. Value of the GENKIT_DEFAULT_MODEL environment variable (if set).
  2. Hardcoded fallback model in the application (currently googleai/gemini-2.0-flash).
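The precedence above amounts to a one-line lookup. The helper below is a sketch with an assumed name (resolveModel); the real selection happens where Genkit is initialized.

```typescript
// Hardcoded fallback, as described in the Model Selection notes.
const FALLBACK_MODEL = "googleai/gemini-2.0-flash";

function resolveModel(env: Record<string, string | undefined>): string {
  // 1. GENKIT_DEFAULT_MODEL wins when set and non-empty.
  const configured = env["GENKIT_DEFAULT_MODEL"]?.trim();
  if (configured) return configured;
  // 2. Otherwise fall back to the hardcoded default model.
  return FALLBACK_MODEL;
}
```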

2. Deploy the Proxy Service

Deploy this Next.js application to your preferred hosting provider (e.g., Firebase App Hosting, Vercel, Netlify, Google Cloud Run). Ensure all the environment variables mentioned above are correctly configured in your hosting provider's settings dashboard. The apphosting.yaml file is preconfigured for Firebase App Hosting.

3. Client Application Integration

In your client application (e.g., Flutter, web app, mobile app):

  • Make POST requests to the API endpoint of your deployed proxy:
    /api/gemini-proxy
  • Include the Authorization header with your PROXY_API_KEY:
    Authorization: Bearer YOUR_PROXY_API_KEY_HERE
  • Send a JSON request body containing the prompt:
    {
      "prompt": "Your query for Gemini AI..."
    }
  • Handle the JSON response, which will be in the format:
    {
      "result": "Response from Gemini AI..."
    }
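The steps above can be sketched as a small request builder. The helper name (buildProxyRequest) and its return shape are illustrative assumptions, not part of the proxy itself; adapt it to your client stack.

```typescript
// Assembles a request to the proxy: endpoint path, auth header, JSON body.
interface ProxyRequest {
  url: string;
  method: "POST";
  headers: Record<string, string>;
  body: string;
}

function buildProxyRequest(
  baseUrl: string,
  proxyApiKey: string,
  prompt: string
): ProxyRequest {
  return {
    url: `${baseUrl}/api/gemini-proxy`,
    method: "POST",
    headers: {
      // Must match the PROXY_API_KEY configured on the server.
      Authorization: `Bearer ${proxyApiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ prompt }),
  };
}
```

With fetch, this becomes `await fetch(req.url, { method: req.method, headers: req.headers, body: req.body })`, and the parsed JSON reply's `result` field holds the Gemini output.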

Example: Testing with cURL

You can test the proxy endpoint using curl from your terminal. Replace YOUR_PROXY_API_KEY_HERE with your actual proxy API key, and replace the placeholder URL with the origin of your deployment (for local development, typically http://localhost:3000).

curl -X POST "https://your-proxy-domain.example/api/gemini-proxy" \
  -H "Authorization: Bearer YOUR_PROXY_API_KEY_HERE" \
  -H "Content-Type: application/json" \
  -d '{
    "prompt": "Explain quantum computing in simple terms."
  }'

4. Rate Limiting & Abuse Detection

The proxy includes a basic built-in rate limiter to protect against simple abuse:

  • Mechanism: It's IP-based and operates in-memory for each server instance.
  • Default Limits: By default, it allows 20 requests per IP address per 1 minute window.
  • Customization: These default values (MAX_REQUESTS_PER_WINDOW and RATE_LIMIT_WINDOW_MS) are currently hardcoded in src/app/api/gemini-proxy/route.ts. For different limits, you would need to modify this file. For distributed environments, consider using a shared store like Redis for rate limiting.
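A fixed-window limiter of this kind can be sketched as follows. This is an illustrative implementation, not the one in route.ts; the constant names mirror those mentioned above, but the function name (allowRequest) and injectable clock are assumptions.

```typescript
// Minimal in-memory, per-instance, IP-based fixed-window rate limiter.
const RATE_LIMIT_WINDOW_MS = 60_000;  // 1 minute window
const MAX_REQUESTS_PER_WINDOW = 20;   // per IP

const windows = new Map<string, { start: number; count: number }>();

function allowRequest(ip: string, now: number = Date.now()): boolean {
  const entry = windows.get(ip);
  // Start a fresh window if none exists or the current one has expired.
  if (!entry || now - entry.start >= RATE_LIMIT_WINDOW_MS) {
    windows.set(ip, { start: now, count: 1 });
    return true;
  }
  // Reject once the window's quota is exhausted.
  if (entry.count >= MAX_REQUESTS_PER_WINDOW) return false;
  entry.count += 1;
  return true;
}
```

Because the Map lives in process memory, each server instance counts independently, which is why the notes above suggest a shared store like Redis for distributed deployments.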

For more robust abuse detection, consider integrating services like Firebase App Check:

  • Firebase App Check: Helps protect your backend resources from abuse, such as billing fraud or phishing. A placeholder comment for integrating App Check can be found in src/app/api/gemini-proxy/route.ts.
  • Implementation: You would typically set up App Check in your Firebase project and then verify the App Check token in your proxy's API route. See the Firebase App Check documentation for details.