DevPlane research

DevPlane is a cockpit for multi-tool software development. The research arm is an empirical program on coordination cost in heterogeneous AI tool ecosystems, using DevPlane's continuous production telemetry as the apparatus, not the subject. Lead study: a pre-registered field test of risk compensation in human-AI coordination.

Why this matters

The portable claim — what this research lets you understand outside the surface domain.

The productivity claims made for AI coding tools are largely grounded in agent-side measurements: lines produced, tasks completed, time-to-PR. If the ironies of automation (Bainbridge 1983) are operative, with operator vigilance falling as agent reliability rises, those measurements systematically overstate the net effect. The DevPlane research program tests that prediction with continuous production telemetry on a real operator running real agents against a real, multi-month codebase. The methodology generalizes: any team running heterogeneous tools through a coordination layer (multi-tool ops dashboards, hospital handoff systems, distributed scientific instruments) faces the same shape of problem.
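The overstatement mechanism can be made concrete with a toy model. This is a hypothetical sketch, not DevPlane's measurement model: it assumes a simple linear risk-compensation curve (review vigilance falls as agent reliability rises) and a fixed cost per escaped defect, then compares what an agent-side metric reports against a net metric that charges for escaped defects.

```python
# Toy illustration of risk compensation inflating agent-side metrics.
# All numbers and the linear vigilance curve are hypothetical
# assumptions for the sketch, not findings from the study.

def net_effect(reliability: float, tasks: int = 100,
               defect_cost: float = 3.0) -> tuple[float, float]:
    """Return (agent_side_metric, net_metric) at one reliability level.

    Risk-compensation assumption: operator review vigilance falls
    linearly as agent reliability rises.
    """
    vigilance = 1.0 - 0.8 * reliability            # assumed compensation curve
    defects_produced = tasks * (1.0 - reliability)
    defects_caught = defects_produced * vigilance
    escaped = defects_produced - defects_caught
    agent_side = float(tasks)                      # what "tasks completed" reports
    net = tasks - defect_cost * escaped            # value after escaped-defect cost
    return agent_side, net

for r in (0.5, 0.7, 0.9):
    agent_side, net = net_effect(r)
    # escape rate per produced defect (1 - vigilance) rises with reliability,
    # which is the Bainbridge-style vigilance effect in this toy model
    print(f"reliability={r:.1f}  agent-side={agent_side:.0f}  "
          f"net={net:.1f}  escape-rate={0.8 * r:.2f}")
```

At every reliability level the agent-side number exceeds the net number, and the per-defect escape rate rises with reliability, which is the shape of distortion the field test is designed to detect or rule out.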

Drill-down — full research surface

A seven-slot baseline, with forthcoming slots shown openly.

Audience tiers

The same headline research surfaced four ways: general audience, peer review, engineering, and product.

  • General-audience explainer · forthcoming
  • Peer-review framing · forthcoming
  • Engineering critique · forthcoming
  • Product implications · forthcoming