MIKAIA® Apps and AIs

MIKAIA® apps provide various analysis options for whole slide images (WSI), in both brightfield (BF) and fluorescence (FL). Used in conjunction, they build comprehensive workflows: the output of one app can serve as the input to another.

Here, you will find an overview of all apps, AIs, and their functionalities, along with the MIKAIA® feature set they are included in (MIKAIA® studio, AI Add-on Bundle, and MIKAIA® lite).

MIKAIA® and all MIKAIA® Apps are for Research Use Only (RUO).

Download the app overview as a PDF.

MIKAIA® Feature Sets

MIKAIA® apps are currently available in three feature sets:

  • MIKAIA® lite – viewing, annotating, and converting slides: the free basic version offers numerous features. Download now.
  • MIKAIA® studio – offers an App Center with various image analysis apps and enables the creation and integration of custom AIs, all at a highly competitive price.
  • MIKAIA® studio + AI Add-on Bundle.

Already using MIKAIA® lite and want to switch to MIKAIA® studio? Upgrading is as simple as importing a license file.

MIKAIA® Apps & Functionalities

Tissue Detection App

Tissue Detection

Outlining tissue: The app separates foreground from background. It can also divide the slide into scan areas for individual statistics. This is achieved by grouping detected tissue particles or by performing TMA dearraying.

Tissue Analysis Apps

Mask by Color

Selecting tissue areas based on color: This app is a useful tool for manually defining regions of interest. It creates masks by thresholding one or multiple color channels, such as masking chromogens in IHC scans. In mIF scans, it can mask specific markers (or combinations thereof) to generate an ROI for subsequent cell analysis with the FL Cell Analysis App.
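Channel thresholding of this kind can be sketched in a few lines of NumPy. This is a generic illustration, not the app's actual implementation; the threshold values are made up:

```python
import numpy as np

def mask_by_color(img, lo, hi):
    """Return a boolean mask of pixels whose channels all lie in [lo, hi]."""
    lo, hi = np.asarray(lo), np.asarray(hi)
    return np.all((img >= lo) & (img <= hi), axis=-1)

# Tiny 2x2 RGB image: one brown-ish (DAB-like) pixel, the rest white background.
img = np.array([[[120, 80, 40], [255, 255, 255]],
                [[255, 255, 255], [255, 255, 255]]], dtype=np.uint8)

# Illustrative brown range: only the DAB-like pixel falls inside it.
mask = mask_by_color(img, lo=(80, 40, 0), hi=(180, 120, 90))
```

The resulting boolean mask can then serve as an ROI for downstream analysis.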

AI Author

DIY – Do It Yourself! Train your own patch-based classifier in three simple steps:

  1. Define the names of the tissue classes you want to distinguish.
  2. Annotate typical regions for these classes in one or more slides; the pre-trained AI is adapted based on these annotations.
  3. Apply your own classifier to new regions or slides. If you are not happy with the accuracy yet, go back to step 2 and add a few more annotations.

H&E Colon Tissue AI

Identifying and outlining tissue types in a WSI: This colon classification app first detects tissue areas and groups them into visually similar clusters or regular tiles. Each cluster or tile is then analyzed by an AI trained to recognize seven classes: tumor cells, healthy mucosa, connective tissue or fat, muscle, mucus, necrosis, and inflammation.

Cell / Spot / Object Detection Apps

IHC Cell Detection

Detecting positive and negative cells in nuclear IHC stains: The app counts positive (DAB+) and negative (H+) cells in IHC stains and calculates statistics, e.g., cell counts and cell density in cells/mm². The app is designed to be compatible with a wide range of antigens, including nuclear or cytoplasmic antigens. It can also be used to detect distinctly stained cells in other, non-IHC stains.

FL Cell Analysis

Single-cell analysis of immunofluorescence slides: The app segments cells, measures marker expression, and conducts differential ROI analysis. Phenotypes can be derived via user-defined cut-offs (supervised) or via clustering (unsupervised). Cell segmentation uses either fast classical computer vision or the more accurate CellPose 3 AI. Generated objects can be post-processed with the Cell-Cell Connections or the Cellular Neighborhood App.

H&E Cell AI

Detecting and classifying cells in H&E-stained biopsies or resections: The app’s AI was trained primarily on colon tissue to recognize these 11 cell types: epithelial cells, tumor cells, eosinophils, lymphocytes, neutrophils, macrophages, fibroblasts, endothelial cells, plasma cells, nerve cells, and other cells.

H&E Cell AI (Detection only)

Counting cells and creating cell training sets: This “detection only” app version detects, segments, and outlines cells without classifying them. Cells can then be labeled by hand, e.g., using the class-changer brush.

H&E Crypt AI

Performing pixelwise segmentation to outline crypts (or glands) and their lumens: The app delineates mucosal crypts and utilizes them as masks for downstream cell analysis, distinguishing between intra-crypt and inter-crypt regions.

HER2/neu FISH

HER2 FISH scoring: The app analyzes DAPI, HER2, and CEP17 markers by detecting nuclei in the DAPI channel, tracing contours, and separating overlapping nuclei. It identifies red and green spots representing HER2 and CEP17 signals, classifying each cell as "positive," "equivocal," or "negative." The overall HER2-to-CEP17 ratio is computed by averaging over all detected cells.
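The per-cell classification and ratio averaging described above can be sketched as follows. This is a simplified illustration; the cut-off values shown are placeholders, not the app's actual scoring thresholds:

```python
def classify_cell(her2_spots, cep17_spots, pos_cutoff=2.2, neg_cutoff=1.8):
    """Classify one nucleus by its HER2/CEP17 spot ratio (illustrative cut-offs)."""
    ratio = her2_spots / cep17_spots
    if ratio >= pos_cutoff:
        return "positive"
    if ratio <= neg_cutoff:
        return "negative"
    return "equivocal"

def overall_ratio(cells):
    """Average HER2/CEP17 ratio over all detected cells."""
    return sum(h / c for h, c in cells) / len(cells)

# (HER2 spot count, CEP17 spot count) per detected nucleus
cells = [(4, 2), (6, 2), (2, 2)]
labels = [classify_cell(h, c) for h, c in cells]
ratio = overall_ratio(cells)  # (2.0 + 3.0 + 1.0) / 3 = 2.0
```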

Detected Objects Analysis Apps

Cell-Cell Connections

Performing spatial analysis between cell types: The app interprets a sample as a graph in which cells are nodes and cell-cell connections are edges. Each cell connects to its adjacent cells, and these connections are classified by the cell types involved. Optionally, long connections can be filtered out. The results are visualized in a histogram and displayed in a matrix table.
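The idea of counting type-pair connections can be sketched as below. For simplicity this sketch connects all cell pairs within a distance threshold; the app's actual adjacency definition may differ:

```python
from collections import Counter
from itertools import combinations
from math import dist

def connection_matrix(cells, max_dist=50.0):
    """Count cell-cell connections per (type, type) pair.

    cells: list of ((x, y), cell_type). Pairs farther apart than
    max_dist are filtered out (the "long connections" filter).
    """
    counts = Counter()
    for (p1, t1), (p2, t2) in combinations(cells, 2):
        if dist(p1, p2) <= max_dist:
            counts[tuple(sorted((t1, t2)))] += 1  # undirected edge
    return counts

cells = [((0, 0), "tumor"), ((10, 0), "lymphocyte"),
         ((12, 5), "lymphocyte"), ((500, 500), "tumor")]
m = connection_matrix(cells)
```

The `Counter` keyed by sorted type pairs corresponds to one triangle of the matrix table mentioned above.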

Spatial Clustering

Grouping adjacent cells or other annotation objects into clusters: The app outlines clusters and reports the number of objects contained in each one. Two adjacent objects are grouped into a cluster when their distance is below a user-defined threshold. Additionally, a minimum number of objects per cluster can be required; clusters containing fewer objects are removed.
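The grouping rule described here is essentially single-linkage clustering with a distance cut-off. A minimal pure-Python sketch (illustrative, not the app's implementation):

```python
from math import dist

def spatial_clusters(points, max_gap=20.0, min_size=2):
    """Group points whose gap is below max_gap (single-linkage via flood fill),
    then drop clusters with fewer than min_size members."""
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        stack = [unvisited.pop()]
        cluster = []
        while stack:
            i = stack.pop()
            cluster.append(i)
            near = {j for j in unvisited if dist(points[i], points[j]) < max_gap}
            unvisited -= near
            stack.extend(near)
        clusters.append(cluster)
    return [c for c in clusters if len(c) >= min_size]

pts = [(0, 0), (5, 0), (8, 3),      # a three-object cluster
       (100, 100),                  # isolated object, dropped by min_size
       (300, 300), (305, 300)]      # a two-object cluster
clusters = spatial_clusters(pts)
```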

Cellular Neighborhood

Classifying cells by their cellular neighborhood: The app collects information about each cell’s k nearest neighboring cells and then clusters the per-cell neighborhood data.
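Collecting per-cell neighborhood data can be sketched as below: for each cell, the types among its k nearest neighbors are tallied, yielding a profile per cell. These profiles are what a subsequent clustering step (e.g., k-means) would group. A brute-force illustration, not the app's implementation:

```python
from collections import Counter
from math import dist

def neighborhood_profiles(cells, k=3):
    """For each cell ((x, y), type), count the types among its k nearest neighbours."""
    profiles = []
    for i, (p, _) in enumerate(cells):
        others = [(dist(p, q), t) for j, (q, t) in enumerate(cells) if j != i]
        others.sort(key=lambda d_t: d_t[0])
        profiles.append(Counter(t for _, t in others[:k]))
    return profiles

# Tumor cells clustered on the left, lymphocytes on the right.
cells = [((0, 0), "tumor"), ((1, 0), "tumor"), ((2, 0), "tumor"),
         ((10, 0), "lymphocyte"), ((11, 0), "lymphocyte"), ((12, 0), "lymphocyte")]
profiles = neighborhood_profiles(cells, k=2)
```

Cells inside the tumor region see only tumor neighbors, while cells in the lymphocyte region see only lymphocytes, so clustering the profiles would recover the two neighborhoods.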

Annotation Metrics

Iterating over existing annotations and calculating metrics: The app computes morphometric metrics (e.g., area, perimeter) as well as color metrics (e.g., mean fluorescence intensity per channel) for a given set of manually or automatically generated annotations. It is useful for IHC profiling and for computing the mean fluorescence intensity (MFI) per annotation or TMA core.
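For a polygonal annotation, the two morphometric examples named above reduce to standard formulas: the shoelace formula for area and summed edge lengths for perimeter. A minimal sketch (generic geometry, not the app's code):

```python
from math import dist

def polygon_metrics(pts):
    """Area (shoelace formula) and perimeter of a closed polygon annotation,
    given as a list of (x, y) vertices in order."""
    n = len(pts)
    area = abs(sum(pts[i][0] * pts[(i + 1) % n][1]
                   - pts[(i + 1) % n][0] * pts[i][1]
                   for i in range(n))) / 2
    perimeter = sum(dist(pts[i], pts[(i + 1) % n]) for i in range(n))
    return area, perimeter

# A 10x10 square annotation: area 100, perimeter 40.
square = [(0, 0), (10, 0), (10, 10), (0, 10)]
area, perimeter = polygon_metrics(square)
```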

Data Export Apps

Annotation Image Export

Creating data sets from annotations: Export a single image per annotation for cells or small objects, or divide large annotations into patches, e.g., when large tissue regions are annotated. Tiles can be exported at native or user-defined resolutions, with configurable size and overlap. Optionally, grey-level segmentation masks can be generated from the annotations.

Tile Export

Exporting tiles (or patches) from a whole-slide image: Tiles can be exported at native or user-defined resolution, with configurable tile size and pixel overlap. Attributes like slide name and tile coordinates can be included in the file name according to a custom naming scheme. Optionally, grey-level segmentation masks can be generated alongside the tiles based on manually or auto-generated annotations.
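Generating a tile grid with a given size and overlap is a small coordinate computation. A sketch of the idea (illustrative; the app's parameters and edge handling may differ):

```python
def tile_grid(width, height, tile=512, overlap=64):
    """Return top-left (x, y) origins of tiles covering a width x height slide,
    stepping by (tile - overlap) and adding an extra tile to cover the far edge."""
    step = tile - overlap
    xs = list(range(0, max(width - tile, 0) + 1, step))
    ys = list(range(0, max(height - tile, 0) + 1, step))
    if xs[-1] + tile < width:   # cover the right edge
        xs.append(width - tile)
    if ys[-1] + tile < height:  # cover the bottom edge
        ys.append(height - tile)
    return [(x, y) for y in ys for x in xs]

# 1024x512 slide, 512 px tiles with 64 px overlap -> 3 tiles in one row.
tiles = tile_grid(1024, 512, tile=512, overlap=64)
```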

Plug-in your own AI

Putting your AI into the hands of a pathologist: The MIKAIA® Plugin API lets you plug in your own Python script (or a program written in any other language). It communicates with MIKAIA® via a REST API.
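A plugin of this kind boils down to exchanging JSON over HTTP: the viewer sends a region descriptor, the plugin returns annotations. The sketch below shows only that request/response shape; the field names and payload schema are invented for illustration and are not the actual MIKAIA® Plugin API:

```python
import json

def handle_analyze(request_body):
    """Hypothetical plugin endpoint body: parse a region request, run your
    model (stubbed out here), and return detections as JSON annotations.
    All field names are illustrative, not the real Plugin API schema."""
    req = json.loads(request_body)
    # ... run your AI on the slide region referenced by req here ...
    detections = [{"type": "polygon",
                   "points": [[0, 0], [10, 0], [10, 10]],
                   "class": "tumor"}]
    return json.dumps({"slide": req["slide"], "annotations": detections})

# Simulated request from the viewer, and the parsed response.
request = json.dumps({"slide": "case01.svs", "region": [0, 0, 512, 512]})
response = json.loads(handle_analyze(request))
```

In a real plugin this handler would sit behind an HTTP server; the JSON round trip stays the same.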