Context
I'm building pinkysbrain (pinkysbrain.xyz) — an open-source, browser-based platform where anyone can play games against real neural recordings from the CL1, with a 3D brain visualization showing spike cascades, connectivity, and analysis metrics in real time.
The platform is built on the CL SDK. Game state is encoded as stimulation patterns, neural responses are decoded via population analysis, and the full analysis toolkit (criticality, connectivity, firing stats) is rendered live alongside gameplay.
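To make the decode step concrete, here is a minimal sketch of the kind of population decoder the game loop uses for a Pong-style task. The channel grouping, window format, and function names are my own illustrations, not the SDK's actual API:

```python
import numpy as np

# Hypothetical population decoder: compare mean firing rates in two
# electrode groups ("up" vs "down") over one decode window. The channel
# grouping and spike-count layout are assumptions for illustration.

def decode_paddle(spike_counts, up_channels, down_channels):
    """spike_counts: 1-D array of per-channel spike counts in one window."""
    up_rate = spike_counts[up_channels].mean()
    down_rate = spike_counts[down_channels].mean()
    if up_rate > down_rate:
        return +1   # move paddle up
    if down_rate > up_rate:
        return -1   # move paddle down
    return 0        # hold

# Synthetic 64-channel window with elevated activity on the "up" group
rng = np.random.default_rng(0)
counts = rng.poisson(2.0, size=64)
counts[:8] += 5
print(decode_paddle(counts,
                    up_channels=np.arange(0, 8),
                    down_channels=np.arange(8, 16)))
```

The real decoder would be calibrated per recording; this just shows the shape of the interface the visualization hooks into.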
The ask
Would it be possible to get access to a few sample CL1 recordings in the SDK's HDF5 format? Specifically:
- Recordings from gameplay sessions (Pong or reaction tasks) where the neurons have been trained
- Or any recordings with structured stimulation and measurable learned responses
Even 2-3 short recordings would be enough to build a working public demo.
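For context on how the demo would ingest a recording, here is the loader sketch I'm working from. The dataset paths ("/spikes/times", "/spikes/channels") are placeholders I'd adapt to whatever schema the SDK's HDF5 export actually uses:

```python
import h5py
import numpy as np

# Sketch of a recording loader. Dataset paths are assumptions; the real
# demo would follow the CL SDK's actual HDF5 schema.

def load_spikes(path):
    with h5py.File(path, "r") as f:
        times = f["/spikes/times"][:]        # spike times, seconds
        channels = f["/spikes/channels"][:]  # electrode index per spike
    return times, channels

# Write a tiny synthetic file so the loader can be exercised end to end
rng = np.random.default_rng(0)
with h5py.File("demo.h5", "w") as f:
    f.create_dataset("/spikes/times", data=np.sort(rng.uniform(0, 1, 100)))
    f.create_dataset("/spikes/channels", data=rng.integers(0, 64, 100))

times, channels = load_spikes("demo.h5")
print(len(times), int(channels.max()))
```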
What this would enable
- A free, browser-based showcase of the CL1 and CL SDK — no hardware or cloud access needed
- Shareable content ("I just played Pong against 800K neurons") that drives awareness
- A community decoder challenge where developers experiment with spike interpretation strategies
- A direct funnel: users who want the real thing → Cortical Cloud
The project is MIT licensed and non-commercial. Neural recordings would be attributed to Cortical Labs and used in accordance with whatever license terms you specify.
Current state
- Landing page live at pinkysbrain.xyz with 3D neural visualization
- Architecture designed around the CL SDK's exact data format (HDF5 schema, WebSocket protocol, analysis API)
- Can run against the simulator today, but real recordings are what make it credible and shareable
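As a sketch of the streaming side mentioned above: the server pushes per-window spike frames to the browser over WebSocket. The field names below are my own working format, not a protocol the SDK defines:

```python
import json

# Hypothetical per-frame message pushed to the browser visualization.
# Field names ("type", "t", "channels") are my own working format.

def make_frame(t, spike_channels):
    return json.dumps({
        "type": "spike_frame",
        "t": round(float(t), 4),                       # window start, seconds
        "channels": [int(c) for c in spike_channels],  # channels that fired
    })

msg = make_frame(0.125, [3, 17, 42])
print(msg)
```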
Happy to discuss further here or on Discord.