# FlowIO System
A scientific computing skill for handling Flow Cytometry Standard (FCS) files using FlowIO — the lightweight Python library for reading and writing FCS 3.1 files containing flow cytometry measurement data.
## When to Use This Skill
Choose FlowIO System when:
- Reading FCS files from flow cytometry instruments
- Extracting channel names, metadata, and event data
- Converting FCS data to pandas DataFrames for analysis
- Writing processed data back to FCS format
Consider alternatives when:
- You need full flow cytometry analysis (use FlowCytometryTools or CytoFlow)
- You need automated gating (use OpenCyto or FlowSOM)
- You need visualization of cytometry plots (use matplotlib/seaborn)
- You need single-cell analysis beyond flow cytometry (use Scanpy)
## Quick Start
```bash
claude "Read an FCS file and convert the flow cytometry data to a DataFrame"
```
```python
import flowio
import numpy as np
import pandas as pd

# Read FCS file
fcs = flowio.FlowData("sample.fcs")

# Extract metadata
print(f"Channels: {fcs.channels}")
print(f"Events: {fcs.event_count}")
print(f"Parameters: {len(fcs.channels)}")

# Get channel names (FlowIO keys the channels dict by channel number as a string)
channel_names = []
for i in range(1, len(fcs.channels) + 1):
    ch = fcs.channels[str(i)]
    pnn = ch.get("PnN", f"P{i}")  # parameter name, e.g. "FITC-A"
    pns = ch.get("PnS", "")       # stain name, e.g. "CD3"
    channel_names.append(pns if pns else pnn)

# Convert the flat events array to a DataFrame
events = np.reshape(fcs.events, (-1, len(fcs.channels)))
df = pd.DataFrame(events, columns=channel_names)
print(f"\nDataFrame shape: {df.shape}")
print(df.describe())
```
## Core Concepts
### FCS File Structure
| Section | Contents | Access |
|---|---|---|
| Header | File version, offsets | fcs.header |
| TEXT | Metadata key-value pairs | fcs.text |
| DATA | Event measurements | fcs.events |
| ANALYSIS | Optional analysis results | fcs.analysis |
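FlowIO parses the TEXT segment into the `fcs.text` dict for you. As an illustration of the underlying format, here is a minimal sketch of delimiter-based TEXT parsing (the raw string below is hypothetical, and the sketch ignores the spec's escaped-delimiter rule, where a doubled delimiter is a literal character):

```python
def parse_fcs_text(raw: str) -> dict:
    """Parse an FCS TEXT segment: the first character is the delimiter,
    and keys/values alternate between delimiters."""
    delim = raw[0]
    tokens = raw.strip(delim).split(delim)
    # Pair up alternating key/value tokens, dropping the leading '$'
    return {k.lstrip("$"): v for k, v in zip(tokens[::2], tokens[1::2])}

# Hypothetical TEXT segment content
raw = "/$P1N/FSC-A/$P1S/Size/$TOT/10000/"
text = parse_fcs_text(raw)
print(text["P1N"], text["P1S"], text["TOT"])  # FSC-A Size 10000
```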
### Channel Information
```python
# Detailed channel inspection (channels dict is keyed by channel number as a string)
for i in range(1, len(fcs.channels) + 1):
    ch = fcs.channels[str(i)]
    print(f"Channel {i}:")
    print(f"  Name (PnN):  {ch.get('PnN', 'N/A')}")
    print(f"  Stain (PnS): {ch.get('PnS', 'N/A')}")
    print(f"  Range (PnR): {ch.get('PnR', 'N/A')}")
    print(f"  Bits (PnB):  {ch.get('PnB', 'N/A')}")
    print(f"  Gain (PnG):  {ch.get('PnG', 'N/A')}")

# Common channels:
#   FSC-A: forward scatter area (cell size)
#   SSC-A: side scatter area (granularity)
#   FITC-A, PE-A, APC-A: fluorescence channels
#   Time: acquisition time
```
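The prefer-PnS-fall-back-to-PnN rule can be factored into a small helper. The dict below is synthetic but mimics the shape of FlowIO's `channels` attribute (channel numbers as string keys):

```python
def channel_labels(channels: dict) -> list:
    """Return one label per channel, preferring the stain name (PnS)
    and falling back to the parameter name (PnN)."""
    labels = []
    for i in range(1, len(channels) + 1):
        ch = channels[str(i)]
        # An empty or missing PnS falls through to PnN
        labels.append(ch.get("PnS") or ch.get("PnN", f"P{i}"))
    return labels

# Synthetic channels dict shaped like FlowIO's
channels = {
    "1": {"PnN": "FSC-A"},
    "2": {"PnN": "FITC-A", "PnS": "CD3"},
    "3": {"PnN": "PE-A", "PnS": ""},
}
print(channel_labels(channels))  # ['FSC-A', 'CD3', 'PE-A']
```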
### Compensation and Transformation
```python
# Apply compensation matrix (spillover correction).
# FlowIO lowercases TEXT keywords, so check both spellings to be safe.
spill_text = fcs.text.get("spill", fcs.text.get("SPILL"))
if spill_text:
    parts = spill_text.split(",")
    n_channels = int(parts[0])
    comp_channels = parts[1:n_channels + 1]
    matrix_values = [float(v) for v in parts[n_channels + 1:]]
    comp_matrix = np.reshape(matrix_values, (n_channels, n_channels))

    # SPILL lists channels by PnN; if channel_names uses PnS labels,
    # map the PnN names back to column positions first
    comp_indices = [channel_names.index(c) for c in comp_channels]
    comp_data = df.iloc[:, comp_indices].values
    # observed = true @ spill, so recover true = observed @ inv(spill)
    compensated = np.linalg.solve(comp_matrix.T, comp_data.T).T
    df.iloc[:, comp_indices] = compensated

# Arcsinh transformation for fluorescence channels
def arcsinh_transform(data, cofactor=150):
    return np.arcsinh(data / cofactor)

fluor_cols = [c for c in df.columns if c not in ["FSC-A", "SSC-A", "Time"]]
for col in fluor_cols:
    df[f"{col}_transformed"] = arcsinh_transform(df[col])
```
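A worked two-channel example (synthetic numbers, not taken from any real instrument) shows the spillover math in isolation:

```python
import numpy as np

# Synthetic spillover: 10% of channel-1 signal leaks into channel 2,
# and 5% leaks the other way
spill = np.array([[1.00, 0.10],
                  [0.05, 1.00]])

true_signal = np.array([[1000.0, 200.0],
                        [500.0,   50.0]])

# The instrument observes the mixed signal: observed = true @ spill
observed = true_signal @ spill

# Compensation inverts the mixing: solve spill.T @ x.T = observed.T
recovered = np.linalg.solve(spill.T, observed.T).T
print(np.allclose(recovered, true_signal))  # True
```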
## Configuration
| Parameter | Description | Default |
|---|---|---|
| `data_type` | Float or integer event storage | Auto-detect |
| `compensation` | Apply spillover correction | `false` |
| `transformation` | Logicle, arcsinh, or biexponential | None |
| `cofactor` | Arcsinh transformation cofactor | 150 |
| `subsample` | Max events to load | All events |
## Best Practices

- **Always apply compensation before analysis.** Flow cytometry channels bleed into each other due to spectral overlap. Extract the spillover matrix from the FCS file's SPILL keyword and apply compensation before gating or visualization.
- **Transform fluorescence channels for visualization.** Raw fluorescence data is roughly log-normally distributed. Apply an arcsinh or logicle transformation for biologically meaningful visualizations. Keep scatter channels (FSC, SSC) on a linear scale.
- **Check that the event count matches expectations.** Verify that `fcs.event_count` matches your acquisition settings. Truncated files (instrument errors, disk full) may have fewer events than expected, and reshaping the events array with the wrong event count produces corrupted data.
- **Use PnS (stain name) when available.** Channels have both parameter names (PnN, like "FITC-A") and stain names (PnS, like "CD3"). Stain names are biologically meaningful and make downstream analysis more interpretable.
- **Subsample large FCS files for exploration.** Files with millions of events consume significant memory. Randomly subsample 10,000-50,000 events for initial exploration and visualization, then process the full file for quantitative analysis.
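The subsampling advice is a one-liner with pandas; a sketch on synthetic stand-in data (a fixed seed keeps the exploratory subset reproducible):

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for a large event DataFrame (one million events)
rng = np.random.default_rng(42)
df = pd.DataFrame(rng.normal(size=(1_000_000, 4)),
                  columns=["FSC-A", "SSC-A", "FITC-A", "PE-A"])

# Random subsample without replacement for exploration/plotting
subset = df.sample(n=20_000, random_state=42)
print(subset.shape)  # (20000, 4)
```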
## Common Issues
**Events array reshape fails with a dimension mismatch.** The total number of values in `fcs.events` must be divisible by the channel count. If it isn't, the file may be corrupted or the channel count in the metadata is wrong. Verify that `len(fcs.events) % len(fcs.channels)` is zero before reshaping.
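A defensive version of that check, sketched with a synthetic flat array in place of `fcs.events`:

```python
import numpy as np

n_channels = 4
events = np.arange(4000, dtype=float)  # synthetic flat event array

remainder = len(events) % n_channels
if remainder != 0:
    raise ValueError(
        f"{len(events)} values are not divisible by {n_channels} channels "
        f"({remainder} left over); the file may be truncated"
    )
matrix = events.reshape(-1, n_channels)
print(matrix.shape)  # (1000, 4)
```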
**Channel names are generic (P1, P2) instead of descriptive.** Some instruments don't populate the PnS field. Check both PnN (parameter name, like "FITC-A") and PnS (stain name, like "CD3"). If both are generic, you'll need the panel configuration from the experiment.
**Compensated data has negative values.** This is expected after spillover correction. Negative values occur when compensation subtracts more spillover than actual signal in dim populations. Use an arcsinh transformation (which handles negatives) rather than a log transformation for compensated data.
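The difference is easy to demonstrate on synthetic compensated values: arcsinh is defined and finite for negative and zero inputs, while log10 is not.

```python
import numpy as np

compensated = np.array([-150.0, 0.0, 150.0, 15000.0])

# arcsinh handles negatives and zero, and is symmetric around 0
transformed = np.arcsinh(compensated / 150)
print(transformed)

# log10 yields NaN for negatives and -inf at zero
with np.errstate(divide="ignore", invalid="ignore"):
    print(np.log10(compensated))
```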