Inquir Compute
Changelog

What's new in Inquir Compute

Product-facing updates only — what you can use, not internal refactors.

v0.10.0 (Latest)

Static pages and config files ship with your code, the same file layout no matter which language you pick, more dependable Go projects, and AI chat that picks up where you left off.

  • New: When you deploy a Node function, everything you keep next to your code (pages, JSON, images, fonts) goes live with it, so serving HTML or reading a local file works the way you built it.
  • Improved: Reading templates or small data files from disk feels the same across Node, Python, and Go: your project files stay in a predictable place, so you spend less time guessing paths.
  • Fixed: Go functions that span several files or folders now behave reliably after deploy, and after you change something deep in the project you always get the version you just saved.
  • New: The in-editor AI assistant remembers the conversation for each function in your browser, so a refresh or coming back later does not wipe your thread.
v0.9.0

Go 1.22 runtime with streaming responses, Python and Go layer packs, and faster cold starts via bundled layer mounts.

  • New: Go 1.22 runtime: write functions in Go with full streaming response support
  • New: Layer packs for Python and Go: AI, HTTP-client, and database bundles are now available for all three runtimes
  • New: Layer bundler merges all attached layers at deploy time, cutting cold-start overhead on every invocation
  • Improved: Tightened container pool reclaim logic: idle containers exit faster, with fewer leaked resources under low traffic
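The layer-bundler entry can be pictured as a deploy-time merge of dependency maps: every attached layer contributes packages, and the container mounts one combined bundle instead of resolving layers per invocation. A sketch of that idea, with illustrative names only (this is not the platform's internal format):

```typescript
// package name -> pinned version, as a single layer would declare it
type DependencyMap = Record<string, string>;

// Merge layers in attachment order; later layers win on version conflicts.
export function mergeLayers(layers: DependencyMap[]): DependencyMap {
  const merged: DependencyMap = {};
  for (const layer of layers) {
    Object.assign(merged, layer);
  }
  return merged;
}
```

Doing this once at deploy time means invocations pay no per-call layer-resolution cost, which is where the cold-start savings come from.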
v0.8.0

Multi-file projects in the browser IDE, execution performance breakdown per invocation.

  • New: File manager in the browser IDE: create, rename, and delete files inside a function without switching tools
  • New: Search across all files in a function (Ctrl+Shift+F) with preview and inline jump-to-match
  • New: Go-to-symbol (Ctrl+Shift+O) for jumping to functions and exports within the current file
  • New: Execution performance panel: wall time, CPU share, and memory usage shown for each invocation
  • Improved: Editor settings (tab size, font size, vim mode) now persist per user account
v0.7.0

Python 3.12 runtime, reworked container pool lifecycle, multi-tool AI agent examples.

  • New: Python 3.12 runtime with native async/await and warm container reuse between invocations
  • New: Starter templates for Python: hello-world, HTTP API, data processing, ML inference, and error handling
  • New: Multi-tool AI agent examples: calculator, web search, scraper, and code-generation tools wired together as functions
  • Improved: Rewrote container pool eviction: containers are recycled more predictably under mixed-language workloads
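The multi-tool agent examples follow a common pattern: each tool is a named function, and the agent dispatches a request to whichever tool matches. A minimal sketch of that dispatch loop; the tool names, signatures, and the toy calculator are assumptions for illustration, not the shipped examples:

```typescript
type Tool = (input: string) => string;

// Tiny "a+b+c" evaluator so the calculator tool stays self-contained.
function evaluateSum(expr: string): number {
  return expr.split("+").reduce((acc, part) => acc + Number(part.trim()), 0);
}

// Registry of tools; real examples also wire in web search, a scraper,
// and code generation, each deployed as its own function.
const tools: Record<string, Tool> = {
  calculator: (expr) => String(evaluateSum(expr)),
  echo: (text) => text,
};

// The agent picks a tool by name and forwards the input.
export function dispatch(toolName: string, input: string): string {
  const tool = tools[toolName];
  if (!tool) throw new Error(`unknown tool: ${toolName}`);
  return tool(input);
}
```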
v0.6.0

PostgreSQL backend, manual approval steps in pipelines, OAuth sign-in.

  • New: Migrated from embedded SQLite to PostgreSQL: functions, logs, and jobs survive process restarts and horizontal scaling
  • New: Human-in-the-loop step in pipelines: pause a run and require a teammate to approve or reject before continuing
  • New: Sign in with GitHub or Google; attach multiple OAuth providers to one workspace account
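The human-in-the-loop step is essentially a small state machine: the run pauses at the step and only continues after a teammate's decision. A sketch of that contract, with illustrative field and method names rather than the platform's actual pipeline schema:

```typescript
type ApprovalState = "pending" | "approved" | "rejected";

export class ApprovalStep {
  state: ApprovalState = "pending";

  // Record a teammate's decision; a decision is final once made.
  decide(approved: boolean): void {
    if (this.state !== "pending") throw new Error("decision already recorded");
    this.state = approved ? "approved" : "rejected";
  }

  // The pipeline runner checks this to know whether to resume the run.
  canContinue(): boolean {
    return this.state === "approved";
  }
}
```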
v0.5.0

Full observability: per-request traces, metrics dashboard, and live charts.

  • New: Request traces with a span-by-span timeline: pinpoint exactly where latency or errors occur inside a function
  • New: Metrics dashboard: invocations, error rate, cold-start rate, and p95 duration per function
  • New: Live charts update over WebSocket, so no page refresh is needed to see current traffic and errors
  • New: Automatic insights surface when error rate, cold starts, or timeouts exceed normal thresholds
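For readers unfamiliar with the metric, "p95 duration" is the value that 95% of invocations finish at or below. The nearest-rank method below is one common way to compute it from raw durations; it is an assumption for illustration, not the dashboard's documented formula:

```typescript
export function p95(durationsMs: number[]): number {
  if (durationsMs.length === 0) throw new Error("no samples");
  const sorted = [...durationsMs].sort((a, b) => a - b);
  // Nearest-rank: the smallest value with at least 95% of samples at or below it.
  const rank = Math.ceil(0.95 * sorted.length);
  return sorted[rank - 1];
}
```

With 20 samples of 1..20 ms, the 95th percentile lands on the 19th sorted value, so one slow outlier out of twenty does not dominate the number the way a max would.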
v0.4.0

VS Code extension, full EN/RU localization, and clearer configuration workflows.

  • New: VS Code extension: deploy a file or folder, invoke a function, and view recent logs without leaving the editor
  • New: End-to-end English and Russian localization across the app, API error messages, and docs
  • New: Per-function environment variables, set from the test panel (Config → Environment) or the API, merged with layer variables and injected into the container at invoke time
  • New: API keys for programmatic and gateway access, with rotation from the console
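The environment-variable entry describes a merge at invoke time: variables contributed by layers come first, then per-function variables override them. The exact precedence is our reading of the changelog wording, so treat this sketch as an assumption:

```typescript
type Env = Record<string, string>;

// Merge layer-provided variables in attachment order, then let the
// function's own variables win any conflict before injection.
export function buildEnv(layerEnvs: Env[], functionEnv: Env): Env {
  const merged: Env = {};
  for (const env of layerEnvs) Object.assign(merged, env);
  return { ...merged, ...functionEnv };
}
```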
v0.3.0

Concurrency control, request queuing, and circuit breaker to keep functions stable under load.

  • New: Per-function concurrency limit: cap how many parallel calls a single function handles at once
  • New: Global request queue with configurable depth: excess requests wait instead of being rejected immediately
  • New: Circuit breaker: routing to a function stops automatically after 5 consecutive errors and resumes once the function recovers
  • New: API Gateway supports server-sent events (SSE), so functions can stream data back to the client incrementally
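The circuit-breaker rule described above (stop routing after 5 consecutive errors, resume on recovery) can be sketched as a small counter. The internals here are illustrative; the platform's actual probe and recovery policy may differ:

```typescript
export class CircuitBreaker {
  private consecutiveErrors = 0;
  private readonly threshold = 5; // matches the "5 consecutive errors" rule

  // The router checks this before sending traffic to the function.
  allowsTraffic(): boolean {
    return this.consecutiveErrors < this.threshold;
  }

  // Any success resets the streak, closing the breaker again.
  recordSuccess(): void {
    this.consecutiveErrors = 0;
  }

  recordError(): void {
    this.consecutiveErrors += 1;
  }
}
```

Counting only consecutive errors means a function that fails intermittently but keeps succeeding never trips the breaker; only a sustained failure streak does.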
v0.2.0

Pipeline orchestration, reusable layer packs, CLI tool, API Gateway, and Firecracker microVM support.

  • New: Pipeline orchestration: chain functions sequentially or as a DAG, with branching and per-step error handling
  • New: Layer system: bundle npm packages once and attach them to any function; AI, LangChain, database, HTTP, and Telegram packs are included out of the box
  • New: CLI tool for deploying code and invoking functions from the terminal or CI pipelines
  • New: API Gateway with path parameters, wildcards, per-route auth (public / API key / bearer), rate limits, and CORS
  • New: Firecracker microVM backend for Linux/KVM: sub-second cold starts with hardware-level isolation
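Running a pipeline as a DAG means executing steps in dependency order, which reduces to a topological sort of the step graph. The Kahn-style sort below is a standard way to get that order; it is a sketch of the idea, not the platform's scheduler:

```typescript
// deps maps each step to the steps it waits on, e.g. { store: ["transform"] }.
export function topoOrder(deps: Record<string, string[]>): string[] {
  const indegree = new Map<string, number>();
  for (const step of Object.keys(deps)) indegree.set(step, deps[step].length);

  // Steps with no unmet prerequisites can run immediately.
  const ready = [...indegree.keys()].filter((s) => indegree.get(s) === 0);
  const order: string[] = [];

  while (ready.length > 0) {
    const step = ready.shift()!;
    order.push(step);
    // Every step waiting on `step` now has one fewer unmet prerequisite.
    for (const [other, prereqs] of Object.entries(deps)) {
      if (prereqs.includes(step)) {
        indegree.set(other, indegree.get(other)! - 1);
        if (indegree.get(other) === 0) ready.push(other);
      }
    }
  }

  if (order.length !== Object.keys(deps).length) {
    throw new Error("cycle in pipeline");
  }
  return order;
}
```

A sequential chain is just the degenerate DAG where each step depends on the previous one.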
v0.1.0 (First release)

First beta: Node.js 22 functions in Docker, hot container pool, async jobs, and isolated workspaces.

  • New: Node.js 22 functions running in isolated Docker containers, on-demand or always-warm
  • New: Hot container pool: keeps containers alive between calls to skip cold starts on frequently invoked functions
  • New: Async job queue: fire a function in the background and poll for its result
  • New: In-browser code editor with TypeScript-aware completions powered by Monaco
  • New: API keys and session-based auth per workspace
  • New: Hard multi-tenant isolation: each workspace has its own data, routes, and filesystem path
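The async job queue's fire-and-poll contract looks like this from the caller's side: submitting returns a job id immediately, and the result is fetched later by polling. The in-memory store and method names below are illustrative, not the service's API:

```typescript
type JobStatus = "running" | "done";

interface Job {
  status: JobStatus;
  result?: unknown;
}

export class JobQueue {
  private jobs = new Map<string, Job>();
  private nextId = 1;

  // Kick off the work in the background and return an id right away.
  submit(fn: () => Promise<unknown>): string {
    const id = String(this.nextId++);
    this.jobs.set(id, { status: "running" });
    fn().then((result) => this.jobs.set(id, { status: "done", result }));
    return id;
  }

  // Callers poll with the id until the job reports "done".
  poll(id: string): Job {
    const job = this.jobs.get(id);
    if (!job) throw new Error(`unknown job: ${id}`);
    return job;
  }
}
```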


Inquir Compute

The simplest way to run AI agents and backend jobs without infrastructure.

Contact info@inquir.org

© 2025 Inquir Compute. All rights reserved.