# LEVERSC: Cross-Platform Scriptable Multichannel 3-D Visualization for Fluorescence Microscopy Images (Online Methods)
# Architecture
The LEVERSC visualization tool is a [node.js](https://nodejs.org) application for visualizing multichannel 3-D volumetric data using WebGL.
LEVERSC uses a local HTTP server port binding to communicate with image processing tools.
Currently LEVERSC has plugins for ImageJ, Python and MATLAB. Additional plugins for KNIME and Julia are planned.
This architecture is very flexible and supports fast, cross-platform communication between any image processing environment that supports HTTP POST/GET requests.
A detailed API breakdown follows, as well as example usage from Python and MATLAB.
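For example, any environment with an HTTP client can drive a LEVERSC figure window without a dedicated plugin. The following is a minimal MATLAB sketch (assuming a figure window is already open and bound to the default first-figure port, 3001; see [Ports](#ports)) that reads the current view parameters and posts back a modified zoom:
```matlab
% Minimal sketch: drive a LEVERSC figure window over plain HTTP (figure 1 -> port 3001)
baseUrl = 'http://localhost:3001';

% GET the current view parameters (the JSON response is decoded into a MATLAB struct)
viewParams = webread([baseUrl '/viewParams']);

% Modify a field and POST the full parameter set back as JSON
viewParams.zoom = 0.4;
webwrite([baseUrl '/viewParams'], viewParams, weboptions('MediaType','application/json'));
```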
# Usage Example
## Scripted Movie Rendering (in MATLAB)
In this section we discuss in detail the use of the scriptable [API interface](#api) for rendering high-quality presentation movies.
The full script ([```sampleVolumeMovie.m```](src/MATLAB/sampleVolumeMovie.m)) is available in the ```src/MATLAB``` directory of the LEVERSC repository.
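If the leversc MATLAB class is on the path and MATLAB's current folder is ```src/MATLAB```, the sample script can be run directly; it takes a single preview flag:
```matlab
% Preview the animation on screen without writing a movie file
sampleVolumeMovie(true);

% Render the animation and write it to lscSampleMovie.mp4
sampleVolumeMovie(false);
```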
### Movie setup
We begin by loading some image data, in this case from the sample .LEVER file distributed along with the LEVERSC repository.
```matlab
%% Load image (in this case from a LEVER file)
strDB='../../sampleImages/lscSampleImage.LEVER';
[im,CONSTANTS]=leversc.loadImage(strDB);
```
A leversc class object must first be initialized. Here we initialize the leversc class with image data and metadata (metadata fields such as ```PixelPhysicalSize``` are important for correct data visualization).
```matlab
% Initialize leversc class with image and metadata
lsc=leversc(im,CONSTANTS);
```
A reproducible movie render should set the rendering parameters explicitly at the start of the script so that the data is visualized consistently.
In this case we used the LEVERSC tool interface to interactively identify good visualization values, then used the ```/renderParams``` API call to read back the current settings.
The image below shows the LEVERSC interface for selecting visualization (rendering) parameters:
![LEVERSC render parameter selection interface](assets/leversc-interface.jpg "LEVERSC visualization parameter selection")
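After adjusting the sliders interactively, the chosen values can be read back over the same ```/renderParams``` endpoint the class uses, for example from the MATLAB prompt (a sketch, assuming figure 1 on its default port 3001):
```matlab
% Read back the interactively selected render parameters (one entry per channel)
params = webread('http://localhost:3001/renderParams');
disp(params(1));   % values to hard-code into the rendering script
```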
The selected rendering parameters were hard-coded into our rendering script.
```matlab
%% Set up pre-selected render properties
lsc.renderParams(1).alpha = 1;
lsc.renderParams(1).dark = 0;
lsc.renderParams(1).medium = 0.78;
lsc.renderParams(1).bright = 0.96;
lsc.renderParams(1).color = [1;1;1];
```
We disable the display of most UI elements so that they do not appear in the captured movie frames:
```matlab
%% Disable most UI elements
lsc.uiParams.sidebar='none';
lsc.uiParams.clockButton='none';
lsc.uiParams.webToolbar='none';
```
We reset the view parameters to defaults for the start of the movie:
```matlab
%% Reset view parameters (also set background to black)
lsc.viewParams.bClip = 0;
lsc.viewParams.worldRot = reshape([1,0,0,0; 0,1,0,0; 0,0,1,0; 0,0,0,1], 16,1);
lsc.viewParams.zoom = 0;
lsc.viewParams.bgColor = [0,0,0];
```
The first step in this movie is to apply a quick animated zoom to fill the display with the actual data in the volume, capturing frames for each zoom level.
Since our movie will run at 10 frames per second (fps) we interpolate our zoom over 10 frames (a 1 second zoom):
```matlab
%% Quick zoom from 0 to 0.4 (1 sec = 10 frames)
zlevels = linspace(0,0.4, 10);
for i=1:length(zlevels)
    lsc.viewParams.zoom=zlevels(i);
    imCap = get_rendered_frame(lsc);
    if ( bPreview )
        imagesc(imCap);
        axis image;
        drawnow();
    else
        writeVideo(vidWrite,imCap);
    end
end
```
Next we apply a 5 second (50 frame) rotation of 180 degrees about the y-axis:
```matlab
%% Rotate halfway around in 5sec
angles = linspace(0,180,50);
for i=2:length(angles)
    ry=[cosd(angles(i)), 0, sind(angles(i)), 0;
        0, 1, 0, 0;
        -sind(angles(i)), 0, cosd(angles(i)), 0;
        0, 0, 0, 1];
    worldRot=ry;
    lsc.viewParams.worldRot=worldRot(:);
    imCap = get_rendered_frame(lsc);
    if ( bPreview )
        imagesc(imCap);
        axis image;
        drawnow();
    else
        writeVideo(vidWrite,imCap);
    end
end
```
Next we move the sampling plane to the edge of the volume and turn on planar clipping.
Then we animate moving the plane to just behind the volume center.
The plane animation is 2 seconds (20 frames) long:
```matlab
%% Put sample plane at the back of the volume and set to planar clipping mode,
%% then animate plane movement into just back from center
planeZs = linspace(85,50, 20);
lsc.viewParams.planeCenter(3) = planeZs(1);
lsc.viewParams.bClip = 1; % Plane-clipping mode
for i=1:length(planeZs)
    lsc.viewParams.planeCenter(3) = planeZs(i);
    imCap = get_rendered_frame(lsc);
    if ( bPreview )
        imagesc(imCap);
        axis image;
        drawnow();
    else
        writeVideo(vidWrite,imCap);
    end
end
```
We apply another 180 degree rotation to bring the volume the rest of the way back to the starting view.
This time we update the world rotation matrix by multiplying it with a per-frame delta rotation matrix:
```matlab
%% Rotate the rest of the way around with plane clipping on (5sec)
% This time we use a "delta" rotation matrix to show how matrix
% products can be used
frameRots = 50;
angleDelta = 180 / frameRots;
deltaRY = [cosd(angleDelta), 0, sind(angleDelta), 0;
           0, 1, 0, 0;
           -sind(angleDelta), 0, cosd(angleDelta), 0;
           0, 0, 0, 1];
worldRot = reshape(lsc.viewParams.worldRot, 4,4);
for i=1:frameRots
    worldRot = deltaRY * worldRot;
    lsc.viewParams.worldRot = worldRot(:);
    imCap = get_rendered_frame(lsc);
    if ( bPreview )
        imagesc(imCap);
        axis image;
        drawnow();
    else
        writeVideo(vidWrite,imCap);
    end
end
```
As a final animation, we change the plane clipping mode to slice sampling, and animate moving out of the volume towards the camera:
```matlab
%% Change to slice clipping mode and pull plane back toward camera
startZ = lsc.viewParams.planeCenter(3);
planeZs = linspace(startZ,1,20);
lsc.viewParams.bClip = 2; % Slice-clipping mode
for i=1:length(planeZs)
    lsc.viewParams.planeCenter(3) = planeZs(i);
    imCap = get_rendered_frame(lsc);
    if ( bPreview )
        imagesc(imCap);
        axis image;
        drawnow();
    else
        writeVideo(vidWrite,imCap);
    end
end
```
If run with the ```bPreview``` flag set to false, this will generate ```lscSampleMovie.mp4```.
**NOTE:** The LEVERSC window size will determine the resolution of the video produced, so it is important to set the window size appropriately before running the video script.
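A simple way to confirm the output resolution before committing to a long render is to capture a single frame with the ```get_rendered_frame``` helper and inspect its size (a suggested check, not part of the sample script):
```matlab
% Capture one frame and report the resolution the movie frames will have
imCheck = get_rendered_frame(lsc);
fprintf('Captured frame size: %d x %d pixels\n', size(imCheck,2), size(imCheck,1));
```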
# API
## Ports
LEVERSC *Figure* windows are each bound to a local port, beginning at port 3001 for figure 1, port 3002 for figure 2, and so on.
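In other words, the base URL for a given figure can be computed from the figure number (a sketch, assuming the default local binding):
```matlab
% Build the base URL for a given LEVERSC figure number (figure N -> port 3000 + N)
fignum = 2;
baseUrl = sprintf('http://localhost:%d', 3000 + fignum);   % figure 2 -> port 3002
```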
## ```/loadfig/:fignum (POST)```
This request posts a complete volume to the LEVERSC figure window.
**TODO:** Payload details for the loadfig call
## ```/renderParams (GET/POST)```
This request sends or receives the volume parameters (channel colors and transfer functions) which control the volumetric visualization.
The request/response payload is a JSON array representing colors and transfer functions per-channel (one object for each channel):
```json
[
    // Per-channel entry (default values are shown below)
    {
        "bVisible": true,   // Show channel in rendering
        "alpha": 1,         // Channel opacity for blending with other channels
        "dark": 0,          // Minimum cutoff for dark values; anything below "dark" maps to 0
        "medium": 0.5,      // Curve midpoint intensity (0.5 is linear)
        "bright": 1,        // Saturation for bright values; anything above "bright" maps to 1
        "color": [1,0,0]    // Channel color (red is the default for channel 1)
    },
    // ... entries for channels 2 to num_channels
]
```
## ```/screenCap (GET)```
This request returns a screen capture image of the LEVERSC figure window.
This can be used to render high-quality scripted movies from Python or MATLAB.
The response payload is a **PNG image** of the captured window.
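From MATLAB, ```webread``` decodes an image response directly into a numeric array, so a capture can be fetched and saved without the bundled class (a sketch, assuming figure 1 on port 3001):
```matlab
% Fetch a screen capture of figure 1 and save it to disk
imCap = webread('http://localhost:3001/screenCap');
imwrite(imCap, 'leversc_capture.png');
```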
## ```/strDB/:strDB```
This request can be used to visualize an on-disk LEVER file rather than an image file. The ```:strDB``` argument must be a URL-encoded, fully qualified path to the .LEVER file.
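A sketch of using this endpoint from MATLAB, with ```urlencode``` guarding the path (the request method is assumed to be a simple GET, and the path below is only illustrative):
```matlab
% Point figure 1 at an on-disk .LEVER file (path must be URL-encoded)
strDB = '/full/path/to/lscSampleImage.LEVER';   % illustrative path
webread(['http://localhost:3001/strDB/' urlencode(strDB)]);
```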
## ```/uiParams (GET/POST)```
This request sends or receives the display status of interface view elements (UI elements) such as the view sidebar, toolbar buttons, and the scale bar.
This is generally used for clearing interface elements during movie rendering.
The request/response payload is a JSON object containing UI element selections:
```json
// Default values for fields are given below
{
    "sidebar": "block",     // HTML display style of the UI sidebar ("none" to disable)
    "webToolbar": "block",  // HTML display style of the UI toolbar buttons ("none" to disable)
    "logoDiv": "block",     // HTML display style of the LEVER logo image ("none" to disable)
    "clockButton": "block", // HTML display style of the UI clock button ("none" to disable)
    "time": 1               // Frame number displayed in the UI clock element
}
```
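The movie example above sets these fields through the MATLAB class (e.g. ```lsc.uiParams.sidebar='none'```); the same GET, modify, POST pattern works against the raw endpoint (a sketch, figure 1 assumed on port 3001):
```matlab
% Hide the sidebar and toolbar of figure 1 for a clean capture
baseUrl = 'http://localhost:3001';
uiParams = webread([baseUrl '/uiParams']);
uiParams.sidebar = 'none';
uiParams.webToolbar = 'none';
webwrite([baseUrl '/uiParams'], uiParams, weboptions('MediaType','application/json'));
```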
## ```/viewParams (GET/POST)```
This request sends or receives the camera and sampling plane parameters that control the view of the volume.
The request/response payload is a JSON object containing view parameters:
```json
// Default values for fields are given below
{
    "zoom": 0,              // Amount of camera zoom between 0 (zoomed out) and 1 (fully zoomed in); affects the camera field of view
    "pos": [0,0,-5],        // Camera position in world coordinates (only used for pan; see worldRot for rotating the volume)
    "worldRot": [1,0,0,0,   // 4x4 world rotation matrix (flattened). See the
                 0,1,0,0,   // movie-making example for an example of
                 0,0,1,0,   // programmatic control of the rotation.
                 0,0,0,1],
    "bClip": 0,             // Sampling plane mode (0, 1, or 2)
                            // 0 - No clipping
                            // 1 - Clip front (display data behind the clipping plane)
                            // 2 - Slice clipping (sample only the plane intersection slice)
    "planeCenter": [xDim/2,yDim/2,zDim/2], // Sample plane location in image coordinates. NOTE: this is a point on the plane; the plane orientation is determined by the view direction (worldRot).
    "bgColor": [0.4,0.4,0.4,1], // Canvas background color (including alpha)
    "volColor": [0,0,0,1]       // Volume background color
}
```
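The ```bClip``` and ```planeCenter``` fields are the ones animated in the movie example above; a minimal sketch using the MATLAB class (the z value below is purely illustrative):
```matlab
% Clip the front of the volume at a chosen z position (image coordinates)
lsc.viewParams.bClip = 1;             % 1 - clip front (show data behind the plane)
lsc.viewParams.planeCenter(3) = 40;   % illustrative z position on the sampling plane
```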
All software in this project is copyright (c) 2015-2021 Drexel University.
This release of the LEVERSC multi-channel 3-D viewer is not yet public. It is intended for
reviewer evaluation, and for select external collaborators working with Dr. Cohen's group. For now, the
software should be considered "unlicensed". Please do not redistribute or share.
Once the manuscript is accepted, the code will be made available free and open source. The code will be licensed
under the MIT License, reproduced below for reference.
Andy Cohen
April 2021
https://bioimage.coe.drexel.edu
Proposed license to be applied to the LEVERSC source after manuscript acceptance:
MIT License
Copyright (c) 2015-2021 Drexel University
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
# LEVERSC: Cross-Platform Scriptable Multichannel 3-D Visualization for Fluorescence Microscopy Images
LEVERSC is a cross-platform multichannel 3-D visualization tool for volumetric fluorescent images such as those produced by confocal or light-sheet microscopy.
## Installation
Install the LEVERSC app for your operating system, then follow the instructions for integrating LEVERSC into your client of choice ([ImageJ](#imagej-plugin), [Python](#python-module), or [MATLAB](#matlab-class)).
### **App Install**
### MacOS App Install
- Download and run the [MacOS installer](https://leverjs.net/download/app/Mac/leverjs-21.4.1.dmg).
### Windows App Install
- Download and run the [Windows installer](https://leverjs.net/download/app/Windows/leverjs%20Setup%2021.4.1.exe).
### Linux App Manual Install
1. Download the [Linux Appimage](https://leverjs.net/download/app/Posix/leverjs-21.4.1.AppImage).
2. Symlink the appimage file to a folder in ```$PATH``` (e.g. ```~/.local/bin```).
```bash
ln -fs /path/to/leverjs-*.AppImage ~/.local/bin/leverjs
```
### **Client Install**
### ImageJ Plugin
1. Download the [ImageJ plugin](https://leverjs.net/download/client/imagej-plugin/Leversc_IJ-21.4.1.jar).
2. Copy the plugin jar file to the ```plugins/3D``` folder in your ImageJ executable directory.
3. The plugin will appear in the ImageJ Plugins menu as ```Plugins->3D->Leversc Viewer```.
**Note:** If using Fiji (**highly recommended**), you may wish to create the ```3D``` subfolder in ```plugins``` and place the jar file within, as this makes the plugin easier to find. Alternatively, the jar file can be placed directly in the ```plugins``` folder; it will then be listed in the Fiji menu as ```Plugins->Leversc Viewer```.
### Python Module
1. Download the [Python module](src/Python) directory (Select the download icon and choose ```Download this directory```).
2. Extract the downloaded Python folder to a convenient location.
3. Add the folder to your ```PYTHONPATH``` environment variable so that ```import leversc``` statements can automatically find the LEVERSC module.
### MATLAB Class
1. Download the [MATLAB class](src/MATLAB) directory (Select the download icon and choose ```Download this directory```).
2. Extract the downloaded MATLAB class and support folders to a convenient location.
3. Add the folder to your MATLAB path, for example by adding the statement ```addpath('path/to/extracted/folder')``` to your ```startup.m``` file.
## Basic Usage Examples
### Fiji/ImageJ
1. Download the [sample HDF5 image](https://git-bioimage.coe.drexel.edu/opensource/leversc/-/raw/master/sampleImages/lscSampleImage.h5?inline=false "LEVERSC Sample HDF5 Image").
2. Choose ```Open...``` in Fiji and navigate to the downloaded image (or another dataset).
3. Once the image has loaded, select ```Plugins->3D->Leversc Viewer```. This should launch the LEVERSC program and load the image data for the selected frame.
### Python
1. Download or clone the full [LEVERSC repository](https://git-bioimage.coe.drexel.edu/opensource/leversc/-/archive/master/leversc-master.zip).
2. Navigate to the LEVERSC ```src/Python``` folder.
3. Run: ```python test_leversc.py```.
### MATLAB
1. Download or clone the full [LEVERSC repository](https://git-bioimage.coe.drexel.edu/opensource/leversc/-/archive/master/leversc-master.zip).
2. Launch MATLAB and navigate to the leversc ```src/MATLAB``` folder.
3. In the MATLAB terminal, run: ```sampleRotateVolumeMovie```.
## Further Details
Details of the LEVERSC tool architecture and API can be found in the [methods](docs/methods.md "LEVERSC online methods") section. The source code for the LEVERSC integrations is available in this repository ([ImageJ plugin](src/ImageJ "ImageJ plugin source code"), [Python module](src/Python "Python module source code"), [MATLAB class](src/MATLAB "MATLAB class and helper source code")).
Additionally, the node.js and webgl source code for the LEVERJS/LEVERSC tool, as well as build instructions, are available in the main [leverjs repository](https://git-bioimage.coe.drexel.edu/opensource/leverjs "LEVERJS source code repository").
# LEVERSC plugin for ImageJ
## Building the ImageJ plugin
### Requirements
* Java Development Kit (JDK version 8 or higher)
The jar file will be located in the ```target``` subdirectory and named ```Lever...```
## Installing the ImageJ plugin
Simply copy the plugin to the ```plugins/3D``` folder in your ImageJ executable directory. The plugin should then appear in the ImageJ menu as ```Plugins->3D->Leversc Viewer```.
**Note:** If using Fiji (highly recommended) then you may wish to create the ```3D``` subfolder in ```plugins``` and place the jar file within, as it can make it easier to find. Alternatively, the jar file can be placed directly in the ```plugins``` folder and will be listed in the Fiji menu as Plugins->Leversc Viewer.
% sample script to generate a rotating volume movie
GENERATE_MOVIE=false;
if GENERATE_MOVIE
    movieFile='lscSampleMovie.mp4';
    v=VideoWriter(movieFile,'MPEG-4');
    v.FrameRate=10;
    open(v)
end
strDB='../../sampleImages/lscSampleImage.LEVER';
[im,CONSTANTS]=leversc.loadImage(strDB);
lsc=leversc(im);
lsc.uiParams.time='none';
lsc.uiParams.sidebar='none';
lsc.uiParams.clockButton='none';
lsc.uiParams.webToolbar='none';
lsc.viewParams.zoom=0.4;
% set a breakpoint on the for statement below, then set the
% size of the leversc window, set the zoom as needed,
% use 's' to toggle scale bar. once the view is setup right, let the script
% run
for theta=0:0.1:2*pi
    ry=[cos(theta), 0, sin(theta), 0; 0,1,0,0; -sin(theta), 0, cos(theta), 0; 0,0,0,1];
    worldRot=ry;
    lsc.viewParams.worldRot=worldRot(:);
    % wait for leversc to finish drawing the current frame
    while ~lsc.drawComplete()
        pause(0.1);
    end
    im=lsc.captureImage();
    imagesc(im)
    axis image
    drawnow
    if GENERATE_MOVIE
        writeVideo(v,im);
        drawnow
    end
    fprintf(1,'theta=%f complete\n',theta);
end
if GENERATE_MOVIE
    close(v)
end
% sample script to generate a rotating volume movie
function sampleVolumeMovie(bPreview)
%% Load image (in this case from a LEVER file)
strDB='../../sampleImages/lscSampleImage.LEVER';
[im,CONSTANTS]=leversc.loadImage(strDB);
lsc=leversc(im, CONSTANTS.imageData);
%% Set up pre-selected render properties
lsc.renderParams(1).a = -1.3212;
lsc.renderParams(1).b = 2.2989;
lsc.renderParams(1).c = 0;
lsc.renderParams(1).minRange = 0;
lsc.renderParams(1).maxRange = 0.8700;
lsc.renderParams(1).color = [1;1;1];
%% Disable most UI elements
lsc.uiParams.time='none';
lsc.uiParams.sidebar='none';
lsc.uiParams.clockButton='none';
lsc.uiParams.webToolbar='none';
%% Reset view parameters (also set background to black)
lsc.viewParams.bClip = 0;
lsc.viewParams.worldRot = reshape([1,0,0,0; 0,1,0,0; 0,0,1,0; 0,0,0,1], 16,1);
lsc.viewParams.zoom = 0;
lsc.viewParams.bgColor = [0,0,0];
if ( ~bPreview )
    movieFile='lscSampleMovie.mp4';
    vidWrite=VideoWriter(movieFile,'MPEG-4');
    vidWrite.FrameRate=10;
    open(vidWrite)
end
%% Quick zoom from 0 to 0.4 (1 sec = 10 frames)
zlevels = linspace(0,0.4, 10);
for i=1:length(zlevels)
    lsc.viewParams.zoom=zlevels(i);
    imCap = get_rendered_frame(lsc);
    if ( bPreview )
        imagesc(imCap);
        axis image;
        drawnow();
    else
        writeVideo(vidWrite,imCap);
    end
end
% Padding for 0.2 sec
if ( ~bPreview )
    writeVideo(vidWrite,imCap);
    writeVideo(vidWrite,imCap);
end
%% Rotate halfway around in 5sec
angles = linspace(0,180,50);
for i=2:length(angles)
    ry=[cosd(angles(i)), 0, sind(angles(i)), 0;
        0, 1, 0, 0;
        -sind(angles(i)), 0, cosd(angles(i)), 0;
        0, 0, 0, 1];
    worldRot=ry;
    lsc.viewParams.worldRot=worldRot(:);
    imCap = get_rendered_frame(lsc);
    if ( bPreview )
        imagesc(imCap);
        axis image;
        drawnow();
    else
        writeVideo(vidWrite,imCap);
    end
end
% Padding for 0.2 sec
if ( ~bPreview )
    writeVideo(vidWrite,imCap);
    writeVideo(vidWrite,imCap);
end
%% Put sample plane at the back of the volume and set to planar clipping mode,
%% then animate plane movement into just back from center
planeZs = linspace(85,50, 20);
lsc.viewParams.planeCenter(3) = planeZs(1);
lsc.viewParams.bClip = 1; % Plane-clipping mode
for i=1:length(planeZs)
    lsc.viewParams.planeCenter(3) = planeZs(i);
    imCap = get_rendered_frame(lsc);
    if ( bPreview )
        imagesc(imCap);
        axis image;
        drawnow();
    else
        writeVideo(vidWrite,imCap);
    end
end
% Padding for 0.2 sec
if ( ~bPreview )
    writeVideo(vidWrite,imCap);
    writeVideo(vidWrite,imCap);
end
%% Rotate the rest of the way around with plane clipping on (5sec)
% This time we use a "delta" rotation matrix to show how matrix
% products can be used
frameRots = 50;
angleDelta = 180 / frameRots;
deltaRY = [cosd(angleDelta), 0, sind(angleDelta), 0;
           0, 1, 0, 0;
           -sind(angleDelta), 0, cosd(angleDelta), 0;
           0, 0, 0, 1];
worldRot = reshape(lsc.viewParams.worldRot, 4,4);
for i=1:frameRots
    worldRot = deltaRY * worldRot;
    lsc.viewParams.worldRot = worldRot(:);
    imCap = get_rendered_frame(lsc);
    if ( bPreview )
        imagesc(imCap);
        axis image;
        drawnow();
    else
        writeVideo(vidWrite,imCap);
    end
end
% Padding for 0.2 sec
if ( ~bPreview )
    writeVideo(vidWrite,imCap);
    writeVideo(vidWrite,imCap);
end
%% Change to slice clipping mode and pull plane back toward camera
startZ = lsc.viewParams.planeCenter(3);
planeZs = linspace(startZ,1,20);
lsc.viewParams.bClip = 2; % Slice-clipping mode
for i=1:length(planeZs)
    lsc.viewParams.planeCenter(3) = planeZs(i);
    imCap = get_rendered_frame(lsc);
    if ( bPreview )
        imagesc(imCap);
        axis image;
        drawnow();
    else
        writeVideo(vidWrite,imCap);
    end
end
if ( ~bPreview )
    close(vidWrite)
end
end
function im = get_rendered_frame(lsc)
% We need to poll for draw-complete in order to avoid timeouts
while ~lsc.drawComplete()
    pause(0.1);
end
im=lsc.captureImage();
end