Item logo image for FlowFRAM Connector

FlowFRAM Connector

Extension · Developer Tools
Item media 1 (screenshot) for FlowFRAM Connector
Item media 2 (screenshot) for FlowFRAM Connector
Item media 3 (screenshot) for FlowFRAM Connector

Overview

Connect FlowFRAM to local or remote Runtime and Ollama instances

FlowFRAM Connector bridges the FlowFRAM web application with your local services, enabling two powerful capabilities from a single extension:

▶ RUNTIME PROXY — Run high-performance FRAM (Functional Resonance Analysis Method) simulations on your own machine or private network using FlowFRAM's distributed runtime agent.

▶ OLLAMA PROXY — Use your local Ollama LLM instance for AI-assisted analysis directly from flowfram.com. No API keys needed — your models, your hardware, your data.

🚀 Key Features:
- Dual-purpose: Runtime + Ollama proxy in a single extension
- Seamless connection between FlowFRAM and local runtime agents
- Local LLM inference via Ollama (generate, chat, list models)
- Real-time flow deployment and execution monitoring via SSE
- Tabbed configuration: separate settings for Runtime and Ollama
- Visual badge indicator: R (Runtime), O (Ollama), R·O (both), or red ! (error)
- Automatic connection status detection
- Origin header stripping for full Ollama compatibility
- On-demand host permissions — only requests access when needed
- Secure local communication without exposing data to external servers

💡 Why use a local runtime?
- Execute thousands of simulation iterations per second
- Access local APIs, databases, and network resources
- Keep sensitive data on your own infrastructure
- Work offline once connected

🧠 Why use local Ollama?
- Run AI analysis with your own hardware — no cloud API costs
- Full privacy: your prompts and data never leave your machine
- Use any Ollama-supported model (Llama 3, Mistral, Gemma, Phi, etc.)
- No API keys or subscriptions required

🔧 How it works:
1. Install the FlowFRAM Runtime Agent on your machine (Docker Hub image available) and/or install Ollama with a model
2. Install this extension and configure the service URLs (defaults: localhost:3010 for Runtime, localhost:11434 for Ollama)
3. Open FlowFRAM — the extension automatically bridges your browser to local services, bypassing CORS restrictions

📋 Requirements:
- For Runtime: FlowFRAM Runtime Agent running locally or on your network (Docker: cgoudouris/flowfram-runtime)
- For Ollama: Ollama installed with at least one model pulled (e.g., ollama pull llama3)
- Both services are optional — enable only what you need

🔒 Privacy:
- No data collection, no analytics, no tracking
- Configuration stored locally via chrome.storage.local
- All communication stays between your browser and your local services

This extension is part of PhD research on complex systems modeling using the FRAM methodology. Learn more at https://flowfram.com/agent
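For readers curious what a proxied Ollama call looks like: the extension forwards requests to the standard Ollama REST API at the default URL above. A minimal sketch of how a client might build a non-streaming generate request, assuming the default port from the listing and the model name `llama3` used in the requirements (the helper function here is illustrative, not part of the extension):

```python
import json

# Default service URLs as listed in the extension's configuration.
RUNTIME_URL = "http://localhost:3010"   # FlowFRAM Runtime Agent
OLLAMA_URL = "http://localhost:11434"   # Ollama

def ollama_generate_request(model: str, prompt: str) -> tuple[str, bytes]:
    """Build the URL and JSON body for a non-streaming Ollama /api/generate call."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return f"{OLLAMA_URL}/api/generate", body.encode("utf-8")

url, body = ollama_generate_request("llama3", "Summarize FRAM in one sentence.")
print(url)
```

When sent through the extension, the same request works from flowfram.com pages because the proxy strips the Origin header and bypasses CORS, so no changes to the payload itself are needed.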

Details

  • Version
    2.0.0
  • Updated
    February 27, 2026
  • Offered by
    cesar.goudouris
  • Size
    29.36 KiB
  • Languages
    English
  • Developer
    Email
    cesar.goudouris@gmail.com
  • Non-trader
    This developer has not identified itself as a trader. For consumers in the European Union, please note that consumer rights do not apply to contracts between you and this developer.

Privacy

The developer has disclosed that it will not collect or use your data. To learn more, see the developer’s privacy policy.

This developer declares that your data is

  • Not being sold to third parties, outside of the approved use cases
  • Not being used or transferred for purposes that are unrelated to the item's core functionality
  • Not being used or transferred to determine creditworthiness or for lending purposes

Support

For help with questions, suggestions, or problems, visit the developer's support site
