Factory.ai

Sovereign software development agents for American research

Factory gives you complete control over where your code and data flow. Choose your deployment model, your inference providers, and your telemetry destinations.

Built for environments where security is non-negotiable

Bring Your Own Models

Run any model on your infrastructure

Deploy US-based open-source models such as GPT-OSS, Granite, or Nemotron; connect to institutional gateways like LBNL CBorg; or bring your own fine-tuned models. Factory is model-agnostic by design.

Model configuration docs →

Flexible Deployment

Cloud, hybrid, or fully on-premises

Run Droids on laptops, CI runners, or HPC clusters. Route LLM traffic through your approved gateways. Air-gapped deployment available for classified environments that require full offline operation.

Privacy & data flows →

Security that scales from laptop to HPC cluster

Factory's hierarchical settings system ensures consistent governance across every environment where Droids run.

  • 01

    Hierarchical policy enforcement

    Org-level policies cascade down and cannot be overridden. Projects extend but never weaken. Users customize within allowed bounds. One config, consistent everywhere.

    Read the docs →
  • 02

    Deterministic agent controls

    Command risk classification with allow/deny lists. Droid Shield for secret scanning. Hooks for DLP integration. Safety built on hard boundaries, not model behavior.

    Read the docs →
  • 03

    Enterprise identity integration

    SSO via SAML/OIDC with Okta, Azure AD, or Google Workspace. SCIM provisioning for automated user lifecycle. Role-based access that flows from your IdP.

    Read the docs →
  • 04

    Full observability and audit

    OpenTelemetry-native with metrics, traces, and logs. Route telemetry to your own collectors or use Factory analytics. SOC 2, ISO 27001, and ISO 42001 certified.

    Read the docs →
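The cascade described above can be sketched as a layered merge. This is an illustrative model only, assuming a hypothetical three-level scheme; the key names and the `merge_policies` function are not Factory's actual configuration schema:

```python
# Illustrative sketch of hierarchical policy merging: org settings win,
# projects may extend but never override, and users may customize only
# keys explicitly marked customizable. Names are hypothetical.

def merge_policies(org: dict, project: dict, user: dict,
                   user_customizable: set) -> dict:
    """Merge three policy layers with org-level precedence."""
    effective = dict(org)  # org-level policies form the base and are final

    # Projects may add new keys but never change an org-level value.
    for key, value in project.items():
        if key not in effective:
            effective[key] = value

    # Users may only touch keys explicitly marked customizable.
    for key, value in user.items():
        if key in user_customizable and key not in org:
            effective[key] = value

    return effective

org = {"telemetry_endpoint": "otel.corp.example", "deny_commands": ["rm -rf"]}
project = {"telemetry_endpoint": "evil.example", "model": "gpt-oss-120b"}
user = {"model": "granite-3", "editor_theme": "dark"}

result = merge_policies(org, project, user, user_customizable={"editor_theme"})
print(result["telemetry_endpoint"])  # otel.corp.example  (org value survives)
print(result["model"])               # gpt-oss-120b       (project extends)
print(result["editor_theme"])        # dark               (user, within bounds)
```

The point of the pattern is that the project's attempt to redirect `telemetry_endpoint` is silently discarded: lower layers can add, never replace.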

US-based open-source models ready for deployment

Factory supports leading open-weight models, many developed in the US, with full access to model weights. Deploy them on your infrastructure with vLLM or TGI, or connect any model through standard APIs.

Llama

Meta

Nova

AWS

Nemotron

NVIDIA

Granite

IBM

Mistral

Mistral AI

OLMo

AI2

+ Any LLM via OpenAI, Anthropic, or custom endpoints
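As a sketch of what "any LLM via custom endpoints" means in practice: vLLM and TGI both expose OpenAI-compatible HTTP routes, so a self-hosted model is reached with an ordinary chat-completions request. The endpoint URL and model name below are placeholders for your own gateway, not Factory defaults:

```python
import json
import urllib.request

# A self-hosted vLLM/TGI server typically exposes an OpenAI-compatible
# /v1/chat/completions route. Endpoint and model name are placeholders.
ENDPOINT = "http://localhost:8000/v1/chat/completions"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Assemble a standard chat-completions request for a compatible gateway."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }
    return urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("gpt-oss-120b", "Summarize yesterday's beamline logs.")
# Dispatching is omitted here; urllib.request.urlopen(req) would send it.
print(json.loads(req.data)["model"])  # gpt-oss-120b
print(req.get_method())               # POST
```

Because the wire format is the same everywhere, swapping Llama for Granite, or a local server for an institutional gateway, is a change of `ENDPOINT` and `model`, not of client code.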

AI agents built for scientific computing

From accelerator operations to data pipelines, Factory Droids understand the unique demands of research infrastructure.

Accelerator Operations

AI co-pilots for fault diagnosis, root cause analysis, and routine tuning. Aggregate context across control systems, detect anomalies via historical comparison, and assist during faults with human-gated execution.

Beamline Automation

End-to-end experiment coordination from parameter selection to adaptive real-time analysis. Agents query materials databases, combine simulations with AI, and generate control-system plans.

Data Pipeline Engineering

Automate development of processing workflows, metadata capture systems, and analysis pipelines. Transform context-poor experimental data into ML-ready datasets with complete sample genealogy.

Scientific Software

Accelerate development of simulation codes, analysis tools, and control systems. Factory agents understand domain-specific languages, HPC patterns, and legacy Fortran codebases.

Knowledge Capture

Preserve institutional expertise by encoding tacit subject-matter-expert knowledge into machine-accessible formats. Reduce risk from personnel turnover and maintain operational continuity.

Safety-Critical Systems

Co-pilot patterns with strict separation of planning and execution. Read-only access by default, human-gated writes, and layered verification for high-consequence environments.
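The read-only-by-default, human-gated-write pattern above can be sketched in a few lines. This is a minimal illustration of the control-flow idea, assuming a hypothetical `execute` helper and action verbs; it is not Factory's actual agent API:

```python
# Sketch of human-gated execution: read-only actions run directly,
# anything that mutates state requires explicit human approval.
# The verbs and function names here are illustrative placeholders.

READ_ONLY = {"status", "query", "diff"}

def execute(action: str, approver=input) -> str:
    """Run read-only actions directly; gate writes behind a human yes/no."""
    verb = action.split()[0]
    if verb in READ_ONLY:
        return f"ran: {action}"
    answer = approver(f"Approve write action '{action}'? [y/N] ")
    if answer.strip().lower() == "y":
        return f"ran: {action}"
    return f"blocked: {action}"

print(execute("status magnets"))                       # ran: status magnets
print(execute("set quad_current 3.2", lambda _: "n"))  # blocked: set quad_current 3.2
```

The safety property comes from the hard boundary in code, not from trusting the model: a write can only happen through the approval path, regardless of what the agent proposes.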

Get in touch

See how Factory works for your industry

Contact sales
