23 Commits

Author SHA1 Message Date
75e490fa18 Implemented option to hide image when opening 2018-01-24 10:35:52 +01:00
70fa714aac Fixed build instructions 2018-01-24 10:33:54 +01:00
2372bf786f Update readme 2017-08-28 17:21:17 +02:00
cf0101a25d Increment version to 0.11.0 2017-08-28 17:15:37 +02:00
3c9fd2b97e Add quick and dirty hack
Quick and dirty hack to allow imagej (python) scripts to open hdf5 files via a routine and do operation with them.
This is far from perfect and needs major cleanup! It is implemented like this due to lack of time ...
2017-08-28 17:13:10 +02:00
eca2294000 Attemp to use plugin options 2017-08-28 16:19:28 +02:00
efc5891e2e update gitignore 2017-08-17 09:38:31 +02:00
ec38041ddf Fix artifact repository URL 2016-11-30 15:53:22 +01:00
8e5ca84a9e removed unecessary logging 2015-04-21 16:06:41 +02:00
d967b27d7c removed log messages 2015-04-21 15:15:06 +02:00
2e643fd215 fixed file close CTRLHA-109 2015-04-14 14:42:50 +02:00
fa892bc7fe Merge pull request #2 in IM/ch.psi.imagej.hdf5 from virtualstack to master
# By ebner
# Via ebner
* commit 'e5331e0c2d5d33dffee6906b5db5bda33f144e5a':
  Updated version CTRLHA-109
  Fixed memory leak that was introduced with the VirtualStack workaround - its still a workaround so CTRLHA-109
  tried to implement close of file CTRLHA-109
  implemented a HDF5 Virtual Stack CTRLHA-109
2015-04-14 14:18:09 +02:00
e5331e0c2d Updated version CTRLHA-109 2015-04-14 14:17:15 +02:00
bbefd328a8 Fixed memory leak that was introduced with the VirtualStack workaround - its still a workaround so CTRLHA-109 2015-04-14 14:15:13 +02:00
9ab9b8b355 tried to implement close of file CTRLHA-109 2015-04-14 13:55:19 +02:00
c46aa1f1ec implemented a HDF5 Virtual Stack CTRLHA-109 2015-04-14 13:38:48 +02:00
2464a795d1 Merge pull request #1 in IM/ch.psi.imagej.hdf5 from slice_selection to master
# By ebner
# Via ebner
* commit 'dfa00c8dc45a6dec5f6255bf4074f930a0a5589a':
  updated readme
  added support for reading only every x-th image
  Added support for slice / tested
  externalized Panel for dialog
  added gradle build file
2015-04-14 11:08:53 +02:00
dfa00c8dc4 updated readme 2015-04-14 11:06:27 +02:00
4aec3a2858 added support for reading only every x-th image 2015-04-14 11:02:25 +02:00
93885470e4 Added support for slice / tested 2015-04-14 10:38:38 +02:00
d7f6602944 externalized Panel for dialog 2015-04-14 09:53:27 +02:00
be80fb446a added gradle build file 2015-04-14 08:30:17 +02:00
e9d73ffde8 Fixed problem with that not all entries were shown ... 2014-10-09 16:38:10 +02:00
18 changed files with 993 additions and 121 deletions

.gitignore

@@ -1 +1,4 @@
+.idea
 /target
+.gradle
+build

README.md

@@ -1,8 +1,10 @@
 # Overview
 ImageJ plugin for reading and writing HDF5 files.
+For 3D datasets an individual slice can be selected for visualization.
+For very large datasets, only every x-th slice can be displayed. This is done by
+specifying either a plain number, e.g. `10` (for slice 10), or a number preceded by `%`, e.g. `%10` (for every 10th image). Indexing starts at 0.
 
 # Usage
@@ -18,6 +20,16 @@ To save to an HDF5 file use:
 File > SaveAs > HDF5
 ```
 
+## Scripting
+To use this plugin from ImageJ's (Python) scripting interface, the following lines can be used to open a dataset:
+```python
+from ch.psi.imagej.hdf5 import HDF5Reader
+reader = HDF5Reader()
+stack = reader.open("", False, "/Users/ebner/Desktop/A8_d_400N030_.h5", "/exchange/data_dark", True)
+```
 
 # Installation
 To install this plugin, ImageJ needs to be run with a Java 7 or greater JVM.
@@ -77,11 +89,18 @@ cd <FIJI_HOME>
 Starting with Java 8, only the LD_LIBRARY_PATH variable needs to be set. For Mac OS X it is `export DYLD_LIBRARY_PATH=lib/mac64/:$DYLD_LIBRARY_PATH`.
 
 # Development
+To run the tests and the plugin from within the IDE, the following arguments need to be passed:
+
+![RunSettings](RunSettings.png)
+
 To create an all-in-one zip file for installation in an ImageJ installation use:
 
 `mvn clean compile assembly:assembly`
 
 The zip file contains an all-in-one jar as well as the required native libraries for Windows, Linux and Mac OS X.
+
+Note: to build the package you need access to the PSI artifact server. Therefore this only works within the PSI network and with a suitable Maven configuration. An example Maven settings.xml that you can copy to `~/.m2/settings.xml` is located [here](settings.xml).
 
 # Acknowledgements
 This project was inspired by the ImageJ HDF Plugin of Matthias Schlachter, Chair of Pattern Recognition and Image Processing, University of Freiburg, Germany (https://code.google.com/p/imagej-hdf).
 It is a complete rewrite of the code with a focus on efficiency and maintainability.

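The slice-selection syntax introduced in the README changes above (`10` for a single slice, `%10` for every 10th image, zero-based) can be sketched in plain Python. `parse_slice_spec` and `selected_indices` are hypothetical helpers for illustration only, not part of the plugin:

```python
def parse_slice_spec(spec):
    """Parse a slice specification: '10' -> single slice 10, '%10' -> every 10th slice.

    Returns a (slice_index, modulo) pair where exactly one entry is set.
    """
    spec = spec.strip()
    if spec.startswith("%"):
        return None, int(spec[1:])   # modulo: read every n-th image
    return int(spec), None           # single zero-based slice index


def selected_indices(spec, n_slices):
    """Indices that would be read from a stack of n_slices images."""
    slice_index, modulo = parse_slice_spec(spec)
    if modulo is not None:
        return list(range(0, n_slices, modulo))  # indexing starts at 0
    return [slice_index]
```

For a 25-image stack, `%10` would select indices 0, 10 and 20, while `10` selects only slice 10.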
RunSettings.png (new binary file, 358 KiB, not shown)

build.gradle (new file)

@@ -0,0 +1,30 @@
apply plugin: 'java'
apply plugin: 'maven'
group = 'ch.psi'
version = '0.11.0'
description = """"""
sourceCompatibility = 1.7
targetCompatibility = 1.7
repositories {
mavenCentral()
maven { url "http://artifacts.psi.ch/artifactory/libs-releases" }
}
dependencies {
compile group: 'hdf5', name: 'hdf', version:'2.10.0'
compile group: 'hdf5', name: 'hdfobj', version:'2.10.0'
compile group: 'hdf5', name: 'hdf5', version:'2.10.0'
compile group: 'hdf5', name: 'hdf5obj', version:'2.10.0'
compile group: 'org.slf4j', name: 'slf4j-api', version:'1.7.6'
testCompile group: 'junit', name: 'junit', version:'4.11'
compile(group: 'gov.nih.imagej', name: 'imagej', version:'1.46') {
/* This dependency was originally in the Maven provided scope, but the project was not of type war.
This behavior is not yet supported by Gradle, so this dependency has been converted to a compile dependency.
Please review and delete this closure when resolved. */
}
}

gradle/wrapper/gradle-wrapper.jar (new binary file, not shown)

gradle/wrapper/gradle-wrapper.properties (new file)

@@ -0,0 +1,6 @@
#Tue Apr 14 08:25:23 CEST 2015
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-2.3-bin.zip

gradlew (new executable file)

@@ -0,0 +1,164 @@
#!/usr/bin/env bash
##############################################################################
##
## Gradle start up script for UN*X
##
##############################################################################
# Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
DEFAULT_JVM_OPTS=""
APP_NAME="Gradle"
APP_BASE_NAME=`basename "$0"`
# Use the maximum available, or set MAX_FD != -1 to use that value.
MAX_FD="maximum"
warn ( ) {
echo "$*"
}
die ( ) {
echo
echo "$*"
echo
exit 1
}
# OS specific support (must be 'true' or 'false').
cygwin=false
msys=false
darwin=false
case "`uname`" in
CYGWIN* )
cygwin=true
;;
Darwin* )
darwin=true
;;
MINGW* )
msys=true
;;
esac
# For Cygwin, ensure paths are in UNIX format before anything is touched.
if $cygwin ; then
[ -n "$JAVA_HOME" ] && JAVA_HOME=`cygpath --unix "$JAVA_HOME"`
fi
# Attempt to set APP_HOME
# Resolve links: $0 may be a link
PRG="$0"
# Need this for relative symlinks.
while [ -h "$PRG" ] ; do
ls=`ls -ld "$PRG"`
link=`expr "$ls" : '.*-> \(.*\)$'`
if expr "$link" : '/.*' > /dev/null; then
PRG="$link"
else
PRG=`dirname "$PRG"`"/$link"
fi
done
SAVED="`pwd`"
cd "`dirname \"$PRG\"`/" >&-
APP_HOME="`pwd -P`"
cd "$SAVED" >&-
CLASSPATH=$APP_HOME/gradle/wrapper/gradle-wrapper.jar
# Determine the Java command to use to start the JVM.
if [ -n "$JAVA_HOME" ] ; then
if [ -x "$JAVA_HOME/jre/sh/java" ] ; then
# IBM's JDK on AIX uses strange locations for the executables
JAVACMD="$JAVA_HOME/jre/sh/java"
else
JAVACMD="$JAVA_HOME/bin/java"
fi
if [ ! -x "$JAVACMD" ] ; then
die "ERROR: JAVA_HOME is set to an invalid directory: $JAVA_HOME
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
else
JAVACMD="java"
which java >/dev/null 2>&1 || die "ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
# Increase the maximum file descriptors if we can.
if [ "$cygwin" = "false" -a "$darwin" = "false" ] ; then
MAX_FD_LIMIT=`ulimit -H -n`
if [ $? -eq 0 ] ; then
if [ "$MAX_FD" = "maximum" -o "$MAX_FD" = "max" ] ; then
MAX_FD="$MAX_FD_LIMIT"
fi
ulimit -n $MAX_FD
if [ $? -ne 0 ] ; then
warn "Could not set maximum file descriptor limit: $MAX_FD"
fi
else
warn "Could not query maximum file descriptor limit: $MAX_FD_LIMIT"
fi
fi
# For Darwin, add options to specify how the application appears in the dock
if $darwin; then
GRADLE_OPTS="$GRADLE_OPTS \"-Xdock:name=$APP_NAME\" \"-Xdock:icon=$APP_HOME/media/gradle.icns\""
fi
# For Cygwin, switch paths to Windows format before running java
if $cygwin ; then
APP_HOME=`cygpath --path --mixed "$APP_HOME"`
CLASSPATH=`cygpath --path --mixed "$CLASSPATH"`
# We build the pattern for arguments to be converted via cygpath
ROOTDIRSRAW=`find -L / -maxdepth 1 -mindepth 1 -type d 2>/dev/null`
SEP=""
for dir in $ROOTDIRSRAW ; do
ROOTDIRS="$ROOTDIRS$SEP$dir"
SEP="|"
done
OURCYGPATTERN="(^($ROOTDIRS))"
# Add a user-defined pattern to the cygpath arguments
if [ "$GRADLE_CYGPATTERN" != "" ] ; then
OURCYGPATTERN="$OURCYGPATTERN|($GRADLE_CYGPATTERN)"
fi
# Now convert the arguments - kludge to limit ourselves to /bin/sh
i=0
for arg in "$@" ; do
CHECK=`echo "$arg"|egrep -c "$OURCYGPATTERN" -`
CHECK2=`echo "$arg"|egrep -c "^-"` ### Determine if an option
if [ $CHECK -ne 0 ] && [ $CHECK2 -eq 0 ] ; then ### Added a condition
eval `echo args$i`=`cygpath --path --ignore --mixed "$arg"`
else
eval `echo args$i`="\"$arg\""
fi
i=$((i+1))
done
case $i in
(0) set -- ;;
(1) set -- "$args0" ;;
(2) set -- "$args0" "$args1" ;;
(3) set -- "$args0" "$args1" "$args2" ;;
(4) set -- "$args0" "$args1" "$args2" "$args3" ;;
(5) set -- "$args0" "$args1" "$args2" "$args3" "$args4" ;;
(6) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" ;;
(7) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" ;;
(8) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" ;;
(9) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" "$args8" ;;
esac
fi
# Split up the JVM_OPTS And GRADLE_OPTS values into an array, following the shell quoting and substitution rules
function splitJvmOpts() {
JVM_OPTS=("$@")
}
eval splitJvmOpts $DEFAULT_JVM_OPTS $JAVA_OPTS $GRADLE_OPTS
JVM_OPTS[${#JVM_OPTS[*]}]="-Dorg.gradle.appname=$APP_BASE_NAME"
exec "$JAVACMD" "${JVM_OPTS[@]}" -classpath "$CLASSPATH" org.gradle.wrapper.GradleWrapperMain "$@"

gradlew.bat (new file)

@@ -0,0 +1,90 @@
@if "%DEBUG%" == "" @echo off
@rem ##########################################################################
@rem
@rem Gradle startup script for Windows
@rem
@rem ##########################################################################
@rem Set local scope for the variables with windows NT shell
if "%OS%"=="Windows_NT" setlocal
@rem Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
set DEFAULT_JVM_OPTS=
set DIRNAME=%~dp0
if "%DIRNAME%" == "" set DIRNAME=.
set APP_BASE_NAME=%~n0
set APP_HOME=%DIRNAME%
@rem Find java.exe
if defined JAVA_HOME goto findJavaFromJavaHome
set JAVA_EXE=java.exe
%JAVA_EXE% -version >NUL 2>&1
if "%ERRORLEVEL%" == "0" goto init
echo.
echo ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:findJavaFromJavaHome
set JAVA_HOME=%JAVA_HOME:"=%
set JAVA_EXE=%JAVA_HOME%/bin/java.exe
if exist "%JAVA_EXE%" goto init
echo.
echo ERROR: JAVA_HOME is set to an invalid directory: %JAVA_HOME%
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:init
@rem Get command-line arguments, handling Windowz variants
if not "%OS%" == "Windows_NT" goto win9xME_args
if "%@eval[2+2]" == "4" goto 4NT_args
:win9xME_args
@rem Slurp the command line arguments.
set CMD_LINE_ARGS=
set _SKIP=2
:win9xME_args_slurp
if "x%~1" == "x" goto execute
set CMD_LINE_ARGS=%*
goto execute
:4NT_args
@rem Get arguments from the 4NT Shell from JP Software
set CMD_LINE_ARGS=%$
:execute
@rem Setup the command line
set CLASSPATH=%APP_HOME%\gradle\wrapper\gradle-wrapper.jar
@rem Execute Gradle
"%JAVA_EXE%" %DEFAULT_JVM_OPTS% %JAVA_OPTS% %GRADLE_OPTS% "-Dorg.gradle.appname=%APP_BASE_NAME%" -classpath "%CLASSPATH%" org.gradle.wrapper.GradleWrapperMain %CMD_LINE_ARGS%
:end
@rem End local scope for the variables with windows NT shell
if "%ERRORLEVEL%"=="0" goto mainEnd
:fail
rem Set variable GRADLE_EXIT_CONSOLE if you need the _script_ return code instead of
rem the _cmd.exe /c_ return code!
if not "" == "%GRADLE_EXIT_CONSOLE%" exit 1
exit /b 1
:mainEnd
if "%OS%"=="Windows_NT" endlocal
:omega

pom.xml

@@ -3,7 +3,7 @@
 	<modelVersion>4.0.0</modelVersion>
 	<groupId>ch.psi</groupId>
 	<artifactId>imagej.hdf5</artifactId>
-	<version>0.7.0</version>
+	<version>0.12.0</version>
 	<dependencies>
 		<dependency>

settings.xml (new file)

@@ -0,0 +1,44 @@
<?xml version="1.0" encoding="UTF-8"?>
<settings xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0 http://maven.apache.org/xsd/settings-1.0.0.xsd" xmlns="http://maven.apache.org/SETTINGS/1.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<profiles>
<profile>
<repositories>
<repository>
<snapshots>
<enabled>false</enabled>
</snapshots>
<id>central</id>
<name>libs-releases</name>
<url>http://artifacts.psi.ch/artifactory/libs-releases</url>
</repository>
<repository>
<snapshots />
<id>snapshots</id>
<name>libs-snapshots</name>
<url>http://artifacts.psi.ch/artifactory/libs-snapshots</url>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<snapshots>
<enabled>false</enabled>
</snapshots>
<id>central</id>
<name>libs-releases</name>
<url>http://artifacts.psi.ch/artifactory/libs-releases</url>
</pluginRepository>
<pluginRepository>
<snapshots />
<id>snapshots</id>
<name>libs-releases</name>
<url>http://artifacts.psi.ch/artifactory/libs-releases</url>
</pluginRepository>
</pluginRepositories>
<id>artifactory</id>
</profile>
</profiles>
<activeProfiles>
<activeProfile>artifactory</activeProfile>
</activeProfiles>
</settings>

DatasetSelection.java (new file)

@@ -0,0 +1,47 @@
package ch.psi.imagej.hdf5;
import java.util.ArrayList;
import java.util.List;
import ncsa.hdf.object.Dataset;
public class DatasetSelection {
private List<Dataset> datasets = new ArrayList<Dataset>();
private boolean group = false;
private Integer slice;
// Interval to read images
private Integer modulo;
private boolean virtualStack;
public List<Dataset> getDatasets() {
return datasets;
}
public void setDatasets(List<Dataset> datasets) {
this.datasets = datasets;
}
public boolean isGroup() {
return group;
}
public void setGroup(boolean group) {
this.group = group;
}
public void setSlice(Integer slice) {
this.slice = slice;
}
public Integer getSlice() {
return slice;
}
public void setModulo(Integer modulo) {
this.modulo = modulo;
}
public Integer getModulo() {
return modulo;
}
public void setVirtualStack(boolean virtualStack) {
this.virtualStack = virtualStack;
}
public boolean isVirtualStack(){
return this.virtualStack;
}
}

HDF5Reader.java

@@ -10,20 +10,9 @@ import ij.plugin.PlugIn;
 import java.io.File;
 import java.lang.reflect.Array;
-import java.util.ArrayList;
-import java.util.List;
+import java.util.*;
 import java.util.logging.Level;
 import java.util.logging.Logger;
-import java.awt.*;
-import javax.swing.BoxLayout;
-import javax.swing.DefaultListCellRenderer;
-import javax.swing.JCheckBox;
-import javax.swing.JLabel;
-import javax.swing.JList;
-import javax.swing.JPanel;
-import javax.swing.JScrollPane;
-import javax.swing.ScrollPaneConstants;
 
 import ncsa.hdf.object.*;
 import ncsa.hdf.object.h5.*;
@@ -39,41 +28,88 @@ public class HDF5Reader implements PlugIn {
 	 */
 	public static void main(String[] args){
 		HDF5Reader r = new HDF5Reader();
-		r.run("");
+		// r.run("");
+		r.open("", false, "/Users/ebner/Desktop/A8_d_400N030_.h5", "/exchange/data", true);
 	}
 
+	public void run(String arg) {
+		open(arg, true, null, null, true);
+	}
+
 	/**
 	 * Main function plugin
+	 * arg is a space separated list of arguments that can be passed to the run method.
+	 * arg looks something like this: "para1=value1 para2=value2 ....."
+	 *
+	 * Supported arguments for arg:
+	 * open=&lt;path&gt;
+	 * dataset=/your/path/to/dataset
+	 *
 	 */
-	public void run(String arg) {
-		OpenDialog od = new OpenDialog("Open HDF5 ...", arg);
-
-		File tfile = new File(od.getDirectory() + od.getFileName());
-		if (!tfile.exists() || !tfile.canRead()) {
-			IJ.showMessage("Cannot open file: "+tfile.getAbsolutePath());
-			return;
-		}
-		String filename = tfile.getAbsolutePath();
-		IJ.showStatus("Loading HDF5 File: " + filename);
-		IJ.showProgress(0.0);
+	public ImageStack open(String arg, boolean interactive, String filename, String nameOfDataset, boolean virtualstack) {
+		return open(arg, interactive, filename, nameOfDataset, virtualstack, true);
+	}
+
+	public ImageStack open(String arg, boolean interactive, String filename, String nameOfDataset, boolean virtualstack, boolean showImage) {
+		// Map arguments = HDF5Reader.parseArguments(arg);
+
+		File tfile = null;
+		if(interactive) {
+			OpenDialog od = new OpenDialog("Open HDF5 ...", arg);
+			tfile = new File(od.getDirectory() + od.getFileName());
+			if (!tfile.exists() || !tfile.canRead()) {
+				IJ.showMessage("Cannot open file: "+tfile.getAbsolutePath());
+				return null;
+			}
+			// Overwrite filename with selected filename
+			filename = tfile.getAbsolutePath();
+			IJ.showStatus("Loading HDF5 File: " + filename);
+			IJ.showProgress(0.0);
+		}
 
 		// Read HDF5 file
 		H5File file = null;
+		boolean close = true;
+		List<ImageStack> stacks = new ArrayList<>();
+		ImageStack stack = null;
 		try {
 			file = new H5File(filename, H5File.READ);
+			file.setMaxMembers(Integer.MAX_VALUE);
 			file.open();
 
 			List<Dataset> datasets = HDF5Utilities.getDatasets(file);
-			SelectedDatasets selectedDatasets = selectDatasets(datasets);
+
+			DatasetSelection selectedDatasets = null;
+			if(interactive){
+				logger.info("Using manual selection");
+				// Manual selection of the dataset and other parameters via a dialog
+				selectedDatasets = selectDatasets(datasets);
+			}
+			else{
+				logger.info("Using automatic selection");
+				selectedDatasets = new DatasetSelection();
+				for(Dataset dataset: datasets){
+					if(dataset.getFullName().equals(nameOfDataset)){
+						selectedDatasets.getDatasets().add(dataset);
+						break; // we only support one selection for the time being
+					}
+				}
+				selectedDatasets.setVirtualStack(virtualstack);
+			}
+
+			// TODO to be removed - Workaround virtual stack - keep HDF5 file open at the end
+			close = !selectedDatasets.isVirtualStack();
 
 			// TODO Remove
 			// Hack as a proof of principle
 			if(selectedDatasets.isGroup()){
-				ImageStack stack = null;
 				for (Dataset var : selectedDatasets.getDatasets()) {
 					if(stack == null){
@@ -87,11 +123,14 @@
 				ImagePlus imp = new ImagePlus(filename, stack);
 				imp.resetDisplayRange();
-				imp.show();
-				return;
+				if(showImage) {
+					imp.show();
+				}
+				stacks.add(stack);
+				return stack; // TODO should return stacks instead of stack
 			}
 
 			for (Dataset var : selectedDatasets.getDatasets()) {
 				// Read dataset attributes and properties
@@ -118,7 +157,8 @@
 				Object wholeDataset = var.read();
-				ImageStack stack = new ImageStack((int) dimensions[3], (int) dimensions[2]);
+				stack = new ImageStack((int) dimensions[3], (int) dimensions[2]);
+				stacks.add(stack);
 				int stackSize = (int) (dimensions[2] * dimensions[3] * 3);
 				int singleVolumeSize = (int) (dimensions[1] * stackSize);
 				for (int volIDX = 0; volIDX < dimensions[0]; ++volIDX) {
@@ -133,7 +173,9 @@
 				imp = new CompositeImage(imp, CompositeImage.COMPOSITE);
 				imp.setOpenAsHyperStack(true);
 				imp.resetDisplayRange();
-				imp.show();
+				if(showImage) {
+					imp.show();
+				}
 			} else if (numberOfDimensions == 4 && dimensions[3] == 3) {
 				logger.info("3D RGB Image");
@@ -148,7 +190,8 @@
 				Object wholeDataset = var.read();
-				ImageStack stack = new ImageStack((int) dimensions[2], (int) dimensions[1]);
+				stack = new ImageStack((int) dimensions[2], (int) dimensions[1]);
+				stacks.add(stack);
 				int stackSize = (int) (dimensions[1] * dimensions[2] * 3);
 				for (int lev = 0; lev < dimensions[0]; ++lev) {
 					int startIdx = lev * stackSize;
@@ -160,7 +203,9 @@
 				imp = new CompositeImage(imp, CompositeImage.COMPOSITE);
 				imp.setOpenAsHyperStack(true);
 				imp.resetDisplayRange();
-				imp.show();
+				if(showImage) {
+					imp.show();
+				}
 			} else if (numberOfDimensions == 4) {
 				logger.info("4D Image (HyperVolume)");
@@ -175,7 +220,8 @@
 				Object wholeDataset = var.read();
-				ImageStack stack = new ImageStack((int) dimensions[3], (int) dimensions[2]);
+				stack = new ImageStack((int) dimensions[3], (int) dimensions[2]);
+				stacks.add(stack);
 				int size = (int) (dimensions[2] * dimensions[3]);
 				long singleVolumeSize = dimensions[1] * size;
 				for (int volIDX = 0; volIDX < dimensions[0]; ++volIDX) {
@@ -189,7 +235,9 @@
 				imp.setDimensions(1, (int) dimensions[1], (int) dimensions[0]);
 				imp.setOpenAsHyperStack(true);
 				imp.resetDisplayRange();
-				imp.show();
+				if(showImage) {
+					imp.show();
+				}
 			} else if (numberOfDimensions == 3 && dimensions[2] == 3) {
 				logger.info("2D RGB Image");
@@ -203,7 +251,8 @@
 				Object wholeDataset = var.read();
-				ImageStack stack = new ImageStack((int) dimensions[1], (int) dimensions[0]);
+				stack = new ImageStack((int) dimensions[1], (int) dimensions[0]);
+				stacks.add(stack);
 				addSliceRGB(stack, wholeDataset, (int) dimensions[0], (int) dimensions[1]);
 
 				ImagePlus imp = new ImagePlus(filename + " " + datasetName, stack);
@@ -211,41 +260,101 @@
 				imp = new CompositeImage(imp, CompositeImage.COMPOSITE);
 				imp.setOpenAsHyperStack(true);
 				imp.resetDisplayRange();
-				imp.show();
+				if(showImage) {
+					imp.show();
+				}
 			} else if (numberOfDimensions == 3) {
 				logger.info("3D Image");
 
-				// Select what to readout
-				long[] selected = var.getSelectedDims();
-				selected[0] = dimensions[0];
-				selected[1] = dimensions[1];
-				selected[2] = dimensions[2];
-
-				Object wholeDataset = var.read();
-
-				ImageStack stack = new ImageStack((int) dimensions[2], (int) dimensions[1]);
-				int size = (int) (dimensions[1] * dimensions[2]);
-				for (int lev = 0; lev < dimensions[0]; ++lev) {
-					int startIdx = lev * size;
-					addSlice(stack, wholeDataset, startIdx, size);
-				}
+				if(selectedDatasets.isVirtualStack()){
+					logger.info("Use virtual stack");
+					stack = new VirtualStackHDF5(file, var);
+				}
+				else{
+					if(selectedDatasets.getSlice()!=null){
+						// Select what to readout
+						long[] selected = var.getSelectedDims();
+						selected[0] = 1;
+						selected[1] = dimensions[1];
+						selected[2] = dimensions[2];
+
+						long[] start = var.getStartDims();
+						start[0] = selectedDatasets.getSlice();
+
+						Object wholeDataset = var.read();
+						stack = new ImageStack((int) dimensions[2], (int) dimensions[1]);
+						int size = (int) (dimensions[1] * dimensions[2]);
+						// int startIdx = selectedDatasets.getSlice() * size;
+						addSlice(stack, wholeDataset, 0, size);
+					}
+					else if(selectedDatasets.getModulo()!=null){
+						logger.info("Read every "+selectedDatasets.getModulo()+" image");
+
+						// Select what to readout
+						stack = new ImageStack((int) dimensions[2], (int) dimensions[1]);
+						for(int indexToRead=0; indexToRead<dimensions[0]; indexToRead=indexToRead+selectedDatasets.getModulo()){
+							long[] selected = var.getSelectedDims();
+							selected[0] = 1;
+							selected[1] = dimensions[1];
+							selected[2] = dimensions[2];
+
+							long[] start = var.getStartDims();
+							start[0] = indexToRead;
+
+							Object wholeDataset = var.read();
+							int size = (int) (dimensions[1] * dimensions[2]);
+							// int startIdx = selectedDatasets.getSlice() * size;
+							addSlice(stack, wholeDataset, 0, size);
+						}
+					}
+					else{
+						// Select what to readout
+						long[] selected = var.getSelectedDims();
+						selected[0] = dimensions[0];
+						selected[1] = dimensions[1];
+						selected[2] = dimensions[2];
+
+						Object wholeDataset = var.read();
+						stack = new ImageStack((int) dimensions[2], (int) dimensions[1]);
+						int size = (int) (dimensions[1] * dimensions[2]);
+						for (int lev = 0; lev < dimensions[0]; ++lev) {
+							int startIdx = lev * size;
+							addSlice(stack, wholeDataset, startIdx, size);
+						}
+					}
+				}
 
-				ImagePlus imp = new ImagePlus(filename + " " + datasetName, stack);
+				stacks.add(stack);
+				ImagePlus imp = new ImagePlusHDF5(filename + " " + datasetName, stack);
 				imp.resetDisplayRange();
-				imp.show();
+				if(showImage) {
+					imp.show();
+				}
 			} else if (numberOfDimensions == 2) {
 				logger.info("2D Image");
 
 				Object wholeDataset = var.read();
-				ImageStack stack = new ImageStack((int) dimensions[1], (int) dimensions[0]);
+				stack = new ImageStack((int) dimensions[1], (int) dimensions[0]);
+				stacks.add(stack);
 				addSlice(stack, wholeDataset);
 
 				ImagePlus imp = new ImagePlus(filename + " " + datasetName, stack);
 				imp.resetDisplayRange();
-				imp.show();
+				if(showImage) {
+					imp.show();
+				}
 			} else {
 				IJ.showStatus("Variable Dimension " + numberOfDimensions + " not supported");
@@ -259,8 +368,11 @@
 				IJ.outOfMemory("Out of memory while loading file: " + filename);
 			} finally {
 				try {
-					if (file != null) {
-						file.close();
-					}
+					// TODO workaround - to be removed
+					if(close){
+						if (file != null) {
+							file.close();
+						}
+					}
 				} catch (HDF5Exception e) {
 					logger.log(Level.WARNING, "Error while closing: " + filename, e);
@@ -269,6 +381,8 @@
 		}
 		IJ.showProgress(1.0);
+
+		return stack; // TODO should return stacks instead of stack
 	}
 
 	/**
@@ -278,39 +392,12 @@
 	 * @return List of datasets to visualize. If nothing selected the list will be empty
 	 * @throws HDF5Exception
 	 */
-	private SelectedDatasets selectDatasets(List<Dataset> datasets) throws HDF5Exception {
+	private DatasetSelection selectDatasets(List<Dataset> datasets) throws HDF5Exception {
 
 		GenericDialog gd = new GenericDialog("Variable Name Selection");
 		gd.addMessage("Please select variables to be loaded.\n");
 
-		// Filter datasets that are not potential images / that cannot be displayed
-		List<Dataset> fdatasets = new ArrayList<Dataset>();
-		for(Dataset d: datasets){
-			if(d.getRank()>=2 && d.getRank()<=5){
-				fdatasets.add(d);
-			}
-		}
-
-		JList<Dataset> list = new JList<>(fdatasets.toArray(new Dataset[fdatasets.size()]));
-		list.setCellRenderer(new DefaultListCellRenderer() {
-			private static final long serialVersionUID = 1L;
-			public Component getListCellRendererComponent(JList<?> list, Object value, int index, boolean isSelected, boolean cellHasFocus) {
-				JLabel label = (JLabel) super.getListCellRendererComponent(list, value, index, isSelected, cellHasFocus);
-				final Dataset d = ((Dataset) value);
-				label.setText(d.getFullName()+" ("+d.getRank()+"D)");
-				return label;
-			}
-		});
-		JScrollPane scroll = new JScrollPane(list);
-		scroll.setVerticalScrollBarPolicy(ScrollPaneConstants.VERTICAL_SCROLLBAR_ALWAYS);
-		JPanel panel = new JPanel();
-		panel.setLayout(new BoxLayout(panel,BoxLayout.Y_AXIS));
-		panel.add(scroll);
-		JCheckBox checkbox = new JCheckBox("Group Datasets (2D datasets only)");
-		panel.add(checkbox);
+		SelectionPanel panel = new SelectionPanel(datasets);
 
 		gd = new GenericDialog("Variable Name Selection");
 		gd.add(panel);
@ -318,16 +405,18 @@ public class HDF5Reader implements PlugIn {
gd.pack(); gd.pack();
gd.showDialog(); gd.showDialog();
SelectedDatasets selectedDatasets = new SelectedDatasets(); DatasetSelection selectedDatasets = new DatasetSelection();
if (!gd.wasCanceled()) { if (!gd.wasCanceled()) {
selectedDatasets.setDatasets(list.getSelectedValuesList()); selectedDatasets.setDatasets(panel.getSelectedValues());
selectedDatasets.setGroup(checkbox.isSelected()); selectedDatasets.setGroup(panel.groupValues());
selectedDatasets.setSlice(panel.getSlice());
selectedDatasets.setModulo(panel.getModulo());
selectedDatasets.setVirtualStack(panel.useVirtualStack());
}
return selectedDatasets;
}
/**
* Add slice to image stack
* @param stack Stack to add slice
@@ -407,4 +496,22 @@ public class HDF5Reader implements PlugIn {
stack.addSlice(null, g);
stack.addSlice(null, b);
}
public static Map<String,String> parseArguments(String arg){
// ImageJ arguments look something like this: "para1=value1 para2=value2 ....."
Map<String,String> map = new HashMap<>();
arg = arg.trim();
for(String argument: arg.split("\\s+")){
String[] entry = argument.split("=");
if(entry.length==2) {
map.put(entry[0], entry[1]);
}
else{
// Not in key=value form - skip this token
logger.warning("Cannot parse argument " + argument + " - ignoring");
}
}
return map;
}
}
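The `parseArguments` convention above ("para1=value1 para2=value2 ...") can be exercised outside ImageJ. A minimal standalone sketch of the same parsing logic (the class name `ParseArgumentsDemo` is hypothetical, not part of the plugin):

```java
import java.util.HashMap;
import java.util.Map;

public class ParseArgumentsDemo {

    // Mirrors HDF5Reader.parseArguments: split on whitespace, keep
    // only well-formed key=value tokens, silently skip the rest.
    static Map<String, String> parseArguments(String arg) {
        Map<String, String> map = new HashMap<>();
        for (String argument : arg.trim().split("\\s+")) {
            String[] entry = argument.split("=");
            if (entry.length == 2) {
                map.put(entry[0], entry[1]);
            }
        }
        return map;
    }

    public static void main(String[] args) {
        Map<String, String> map = parseArguments("open=/tmp/a.h5 dataset=/exchange/data");
        System.out.println(map.get("open"));    // /tmp/a.h5
        System.out.println(map.get("dataset")); // /exchange/data
    }
}
```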


@@ -0,0 +1,70 @@
package ch.psi.imagej.hdf5;
import java.awt.event.WindowEvent;
import java.awt.event.WindowListener;
import java.util.logging.Logger;
import ij.ImagePlus;
import ij.ImageStack;
public class ImagePlusHDF5 extends ImagePlus {
private static final Logger logger = Logger.getLogger(ImagePlusHDF5.class.getName());
private VirtualStackHDF5 stack;
public ImagePlusHDF5(String title, ImageStack stack) {
super(title, stack);
if(stack instanceof VirtualStackHDF5){
logger.info("VirtualStackHDF5");
this.stack = (VirtualStackHDF5) stack;
}
}
@Override
public void show() {
super.show();
getWindow().addWindowListener(new WindowListener() {
@Override
public void windowOpened(WindowEvent e) {
// logger.info("");
}
@Override
public void windowIconified(WindowEvent e) {
// logger.info("");
}
@Override
public void windowDeiconified(WindowEvent e) {
// logger.info("");
}
@Override
public void windowDeactivated(WindowEvent e) {
// logger.info("");
}
@Override
public void windowClosing(WindowEvent e) {
// logger.info("Closing");
}
@Override
public void windowClosed(WindowEvent e) {
// logger.info("Closed");
if(stack!=null){
stack.close();
}
}
@Override
public void windowActivated(WindowEvent e) {
// logger.info("");
}
});
}
}
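Only `windowClosed` actually does anything in the listener above; the other six callbacks are empty stubs. A sketch of the same close-on-exit behavior using `java.awt.event.WindowAdapter` instead (the class name `CloseOnExitDemo` and the nested `Closeable` interface are hypothetical stand-ins for the `VirtualStackHDF5.close()` contract):

```java
import java.awt.event.WindowAdapter;
import java.awt.event.WindowEvent;

public class CloseOnExitDemo {

    // Stand-in for the VirtualStackHDF5.close() contract.
    interface Closeable { void close(); }

    // WindowAdapter lets us override only the callback we care about,
    // instead of implementing all seven WindowListener methods.
    static WindowAdapter closeOnExit(final Closeable stack) {
        return new WindowAdapter() {
            @Override
            public void windowClosed(WindowEvent e) {
                if (stack != null) {
                    stack.close(); // release the backing HDF5 file
                }
            }
        };
    }

    public static void main(String[] args) {
        final boolean[] closed = {false};
        closeOnExit(() -> closed[0] = true).windowClosed(null);
        System.out.println(closed[0]); // true
    }
}
```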


@@ -1,26 +0,0 @@
package ch.psi.imagej.hdf5;
import java.util.ArrayList;
import java.util.List;
import ncsa.hdf.object.Dataset;
public class SelectedDatasets {
private List<Dataset> datasets = new ArrayList<Dataset>();
private boolean group = false;
public List<Dataset> getDatasets() {
return datasets;
}
public void setDatasets(List<Dataset> datasets) {
this.datasets = datasets;
}
public boolean isGroup() {
return group;
}
public void setGroup(boolean group) {
this.group = group;
}
}


@@ -0,0 +1,111 @@
package ch.psi.imagej.hdf5;
import java.awt.Component;
import java.util.ArrayList;
import java.util.List;
import javax.swing.BoxLayout;
import javax.swing.DefaultListCellRenderer;
import javax.swing.DefaultListModel;
import javax.swing.JCheckBox;
import javax.swing.JLabel;
import javax.swing.JList;
import javax.swing.JPanel;
import javax.swing.JScrollPane;
import javax.swing.ScrollPaneConstants;
import ncsa.hdf.object.Dataset;
import javax.swing.JTextField;
import java.awt.FlowLayout;
public class SelectionPanel extends JPanel {
private static final long serialVersionUID = 1L;
private final JList<Dataset> list;
private JCheckBox checkbox;
private JCheckBox checkBoxVirtualStack;
private JLabel lblSlice;
private JPanel panel;
private JTextField textField;
public SelectionPanel(){
this(new ArrayList<Dataset>());
}
public SelectionPanel(List<Dataset> datasets){
// Filter datasets that are not potential images / that cannot be displayed
List<Dataset> fdatasets = new ArrayList<Dataset>();
for(Dataset d: datasets){
if(d.getRank()>=2 && d.getRank()<=5){
fdatasets.add(d);
}
}
list = new JList<>(new DefaultListModel<Dataset>());
list.setListData(fdatasets.toArray(new Dataset[fdatasets.size()]));
list.setCellRenderer(new DefaultListCellRenderer() {
private static final long serialVersionUID = 1L;
public Component getListCellRendererComponent(JList<?> list, Object value, int index, boolean isSelected, boolean cellHasFocus) {
JLabel label = (JLabel) super.getListCellRendererComponent(list, value, index, isSelected, cellHasFocus);
final Dataset d = ((Dataset) value);
label.setText(d.getFullName()+" ("+d.getRank()+"D)");
return label;
}
});
list.setSelectedIndex(0);
JScrollPane scroll = new JScrollPane(list);
scroll.setVerticalScrollBarPolicy(ScrollPaneConstants.VERTICAL_SCROLLBAR_ALWAYS);
setLayout(new BoxLayout(this,BoxLayout.Y_AXIS));
add(scroll);
checkbox = new JCheckBox("Group Datasets (2D datasets only)");
add(checkbox);
checkBoxVirtualStack = new JCheckBox("Virtual Stack");
checkBoxVirtualStack.setSelected(true);
add(checkBoxVirtualStack);
panel = new JPanel();
FlowLayout flowLayout = (FlowLayout) panel.getLayout();
flowLayout.setAlignment(FlowLayout.LEFT);
add(panel);
lblSlice = new JLabel("Slice (3D only):");
panel.add(lblSlice);
textField = new JTextField();
panel.add(textField);
textField.setColumns(10);
}
public List<Dataset> getSelectedValues(){
return list.getSelectedValuesList();
}
public boolean groupValues(){
return checkbox.isSelected();
}
public Integer getSlice(){
String text = textField.getText();
if(text.matches("^[0-9]+$")){
return Integer.valueOf(text);
}
return null;
}
public Integer getModulo(){
String text = textField.getText();
if(text.matches("^%[0-9]+$")){
return Integer.valueOf(text.replace("%", ""));
}
return null;
}
public boolean useVirtualStack(){
return checkBoxVirtualStack.isSelected();
}
}
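The single text field encodes two options: a bare number selects one slice, while a %-prefixed number sets a modulo (read only every x-th image). A standalone sketch of that parsing convention (the class name `SliceFieldDemo` is hypothetical):

```java
public class SliceFieldDemo {

    // Mirrors SelectionPanel.getSlice(): a bare number is a slice index.
    static Integer parseSlice(String text) {
        return text.matches("^[0-9]+$") ? Integer.valueOf(text) : null;
    }

    // Mirrors SelectionPanel.getModulo(): "%n" means "every n-th image".
    static Integer parseModulo(String text) {
        return text.matches("^%[0-9]+$") ? Integer.valueOf(text.substring(1)) : null;
    }

    public static void main(String[] args) {
        System.out.println(parseSlice("42"));   // 42
        System.out.println(parseModulo("%10")); // 10
        System.out.println(parseSlice("%10"));  // null
    }
}
```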


@@ -0,0 +1,174 @@
package ch.psi.imagej.hdf5;
import java.util.logging.Level;
import java.util.logging.Logger;
import ncsa.hdf.object.Dataset;
import ncsa.hdf.object.h5.H5File;
import ij.ImageStack;
import ij.process.ByteProcessor;
import ij.process.ColorProcessor;
import ij.process.FloatProcessor;
import ij.process.ImageProcessor;
import ij.process.ShortProcessor;
public class VirtualStackHDF5 extends ImageStack {
private static final Logger logger = Logger.getLogger(VirtualStackHDF5.class.getName());
private int bitDepth = 0;
private Dataset dataset;
private H5File file;
public VirtualStackHDF5(H5File file, Dataset dataset){
super((int) dataset.getDims()[2], (int) dataset.getDims()[1]);
this.dataset = dataset;
this.file = file;
}
/** Does nothing. */
public void addSlice(String sliceLabel, Object pixels) {
}
/** Does nothing. */
public void addSlice(String sliceLabel, ImageProcessor ip) {
}
/** Does nothing. */
public void addSlice(String sliceLabel, ImageProcessor ip, int n) {
}
/** Does nothing. */
public void deleteSlice(int n) {
}
/** Does nothing. */
public void deleteLastSlice() {
}
public Object getPixels(int slice) {
try {
long[] dimensions = dataset.getDims();
// Select what to readout
long[] selected = dataset.getSelectedDims();
selected[0] = 1;
selected[1] = dimensions[1];
selected[2] = dimensions[2];
long[] start = dataset.getStartDims();
start[0] = slice - 1; // ImageJ slice indexing starts at 1
Object wholeDataset = dataset.read();
if (wholeDataset instanceof byte[]) {
return (byte[]) wholeDataset;
} else if (wholeDataset instanceof short[]) {
return (short[]) wholeDataset;
} else if (wholeDataset instanceof int[]) {
return HDF5Utilities.convertToFloat((int[]) wholeDataset);
} else if (wholeDataset instanceof long[]) {
return HDF5Utilities.convertToFloat((long[]) wholeDataset);
} else if (wholeDataset instanceof float[]) {
return (float[]) wholeDataset;
} else if (wholeDataset instanceof double[]) {
return HDF5Utilities.convertToFloat((double[]) wholeDataset);
} else {
logger.warning("Datatype not supported");
}
} catch (OutOfMemoryError | Exception e) {
logger.log(Level.WARNING, "Unable to open slice", e);
}
return null;
}
/**
* Assigns a pixel array to the specified slice, where 1<=n<=nslices.
*/
public void setPixels(Object pixels, int n) {
}
/**
* Returns an ImageProcessor for the specified slice, where 1<=n<=nslices.
* Returns null if the stack is empty.
*/
public ImageProcessor getProcessor(int slice) {
long[] dimensions = dataset.getDims();
final Object pixels = getPixels(slice);
// Todo support more ImageProcessor types
ImageProcessor ip;
if (pixels instanceof byte[]){
ip = new ByteProcessor((int) dimensions[2], (int) dimensions[1]);
}
else if (pixels instanceof short[]){
ip = new ShortProcessor((int) dimensions[2], (int) dimensions[1]);
}
else if (pixels instanceof int[]){
ip = new ColorProcessor((int) dimensions[2], (int) dimensions[1]);
}
else if (pixels instanceof float[]){
ip = new FloatProcessor((int) dimensions[2], (int) dimensions[1]);
}
else {
throw new IllegalArgumentException("Unknown stack type");
}
ip.setPixels(pixels);
return ip;
}
/** Returns the number of slices in this stack. */
public int getSize() {
return (int) this.dataset.getDims()[0];
}
/** Returns the label of the Nth image. */
public String getSliceLabel(int slice) {
return "Slice: "+slice;
}
/** Returns null. */
public Object[] getImageArray() {
return null;
}
/** Does nothing. */
public void setSliceLabel(String label, int n) {
}
/** Always returns true. */
public boolean isVirtual() {
return true;
}
/** Does nothing. */
public void trim() {
}
/**
* Returns the bit depth (8, 16, 24 or 32), or 0 if the bit depth is not
* known.
*/
public int getBitDepth() {
return bitDepth;
}
/**
* Close HDF5 file
*/
public void close() {
logger.info("Closing HDF5 file");
try{
file.close();
}
catch(Exception e){
logger.log(Level.WARNING, "Unable to close HDF5 file", e);
}
}
}
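`getPixels()` above leans on the HDF-Java selection arrays: shrinking `selectedDims[0]` to 1 and moving `startDims[0]` turns `dataset.read()` into a single-slice read instead of loading the whole 3D block. The same idea can be sketched without an HDF5 dependency, treating the dataset as one flat array of shape [nSlices][height][width] (the class name `HyperslabDemo` is hypothetical):

```java
public class HyperslabDemo {

    // Copy out one slice from a flat [nSlices][height][width] array.
    // ImageJ slice numbers start at 1, hence the (slice - 1) offset,
    // just like start[0] = slice - 1 in VirtualStackHDF5.getPixels().
    static float[] readSlice(float[] flat, int height, int width, int slice) {
        int sliceLen = height * width;
        float[] out = new float[sliceLen];
        System.arraycopy(flat, (slice - 1) * sliceLen, out, 0, sliceLen);
        return out;
    }

    public static void main(String[] args) {
        float[] flat = new float[2 * 2 * 3]; // 2 slices of 2x3 pixels
        for (int i = 0; i < flat.length; i++) flat[i] = i;
        float[] second = readSlice(flat, 2, 3, 2);
        System.out.println(second[0]); // 6.0
    }
}
```

The real virtual stack defers this read until ImageJ asks for a slice, which is what keeps memory usage flat for large files.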


@@ -0,0 +1,22 @@
package ch.psi.imagej.hdf5;
import org.junit.Test;
import java.util.Map;
import static org.junit.Assert.*;
/**
* Tests for HDF5Reader argument parsing.
*/
public class HDF5ReaderTest {
@Test
public void parseArguments() throws Exception {
Map<String, String> map = HDF5Reader.parseArguments("para1=value1 para2=value2 PARA=VAL");
assertEquals("value1", map.get("para1"));
assertEquals("value2", map.get("para2"));
assertEquals("VAL", map.get("PARA"));
}
}


@@ -2,6 +2,7 @@ package ch.psi.imagej.hdf5;
import static org.junit.Assert.*;
import ij.IJ;
import org.junit.Test;
public class HDF5UtilitiesTest {
@@ -22,4 +23,14 @@ public class HDF5UtilitiesTest {
assertEquals(gdescriptor, "three");
}
@Test
public void testOpen() {
IJ.run("HDF5...");
String descriptor = "/test/one/two/three";
String gdescriptor = HDF5Utilities.getDatasetName(descriptor);
System.out.println(gdescriptor);
assertEquals(gdescriptor, "three");
}
}