Updated notebook documentation and included an example metadata annotation notebook.
@@ -4,7 +4,17 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Import libraries and modules"
+"# Data integration workflow of experimental campaign\n",
+"\n",
+"In this notebook, we will go through a our data integration workflow. This involves the following steps:\n",
+"\n",
+"1. Specify data integration file through YAML configuration file.\n",
+"2. Create an integrated HDF5 file of experimental campaign from configuration file.\n",
+"3. Display the created HDF5 file using a treemap\n",
+"\n",
+"## Import libraries and modules\n",
+"\n",
+"* Excecute (or Run) the Cell below"
 ]
 },
 {
@@ -20,14 +30,15 @@
 "sys.path.append(root_dir)\n",
 "\n",
 "import src.hdf5_vis as hdf5_vis\n",
-"import src.data_integration_lib as dilib\n"
+"import src.data_integration_lib as dilib\n",
+"import pathlib"
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Specify data integration task through yaml configuration file\n",
+"## Step 1: Specify data integration task through YAML configuration file\n",
 "\n",
 "* Create your configuration file (i.e., *.yaml file) adhering to the example yaml file in the input folder.\n",
 "* Set up input directory and output directory paths and Excecute Cell.\n",
@@ -41,15 +52,33 @@
 "outputs": [],
 "source": [
 "#output_filename_path = 'output_files/unified_file_smog_chamber_2024-04-07_UTC-OFST_+0200_NG.h5'\n",
-"yaml_config_file_path = 'input_files/data_integr_config_file_LI.yaml'\n",
-"output_filename_path = dilib.integrate_data_sources(yaml_config_file_path)\n"
+"yaml_config_file_path = 'input_files/data_integr_config_file_TBR.yaml'\n"
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Display integrated HDF5 file using a treemap\n",
+"## Step 2: Create an integrated HDF5 file of experimental campaign.\n",
 "\n",
+"* Excecute Cell. Here we run the function `integrate_data_sources` with input argument as the previously specified YAML config file."
+]
+},
+{
+"cell_type": "code",
+"execution_count": null,
+"metadata": {},
+"outputs": [],
+"source": [
+"\n",
+"hdf5_file_path = dilib.integrate_data_sources(yaml_config_file_path)"
+]
+},
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"## Display integrated HDF5 file using a treemap\n",
+"\n",
 "* Excecute Cell. A visual representation in html format of the integrated file should be displayed and stored in the output directory folder"
 ]
@@ -60,13 +89,17 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"hdf5_vis.display_group_hierarchy_on_a_treemap(output_filename_path)"
+"if isinstance(hdf5_file_path ,list):\n",
+"    for path_item in hdf5_file_path :\n",
+"        hdf5_vis.display_group_hierarchy_on_a_treemap(path_item)\n",
+"else:\n",
+"    hdf5_vis.display_group_hierarchy_on_a_treemap(hdf5_file_path)"
 ]
 }
 ],
 "metadata": {
 "kernelspec": {
-"display_name": "test_atmos_chem_env",
+"display_name": "multiphase_chemistry_env",
 "language": "python",
 "name": "python3"
 },
@@ -84,5 +117,5 @@
 }
 },
 "nbformat": 4,
-"nbformat_minor": 2
+"nbformat_minor": 4
 }
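Taken together, the cells added to the data integration notebook boil down to the short workflow sketched below. This is illustrative rather than a copy of the notebook: the YAML path is the placeholder taken from the diff, root_dir is assumed to be the repository root containing the src/ package, and the list-versus-single handling mirrors the new treemap cell, which implies integrate_data_sources can return either one HDF5 path or a list of paths.

import sys
import pathlib

# Assumption: the notebook lives one level below the repository root, which holds src/.
root_dir = str(pathlib.Path().resolve().parent)
sys.path.append(root_dir)

import src.hdf5_vis as hdf5_vis
import src.data_integration_lib as dilib

# Step 1: YAML configuration file describing the data integration task (placeholder path).
yaml_config_file_path = 'input_files/data_integr_config_file_TBR.yaml'

# Step 2: create the integrated HDF5 file(s) of the experimental campaign.
hdf5_file_path = dilib.integrate_data_sources(yaml_config_file_path)

# Step 3: display each integrated file as a treemap; an HTML copy is stored in the output folder.
if isinstance(hdf5_file_path, list):
    for path_item in hdf5_file_path:
        hdf5_vis.display_group_hierarchy_on_a_treemap(path_item)
else:
    hdf5_vis.display_group_hierarchy_on_a_treemap(hdf5_file_path)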
@@ -23,15 +23,6 @@
 "import src.metadata_review_lib as metadata_review_lib"
 ]
 },
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"\n"
-]
-},
 {
 "cell_type": "markdown",
 "metadata": {},
@@ -53,10 +44,10 @@
 "#hdf5_file_path = \"output_files/unified_file_smog_chamber_2024-03-25_UTC-OFST_+0100_NG.h5\"\n",
 "#yml_file_path = \"output_files/unified_file_smog_chamber_2024-03-25_UTC-OFST_+0100_NG.yaml\"\n",
 "\n",
-"hdf5_file_path = \"output_files/smog_chamber_study_2022-07-26_NatashaG.h5\"\n",
-"yml_file_path = \"output_files/smog_chamber_study_2022-07-26_NatashaG.yaml\"\n",
+"hdf5_file_path = \"output_files/kinetic_flowtube_study_2022-01-31_LuciaI.h5\"\n",
+"yml_file_path = \"output_files/kinetic_flowtube_study_2022-01-31_LuciaI.yaml\"\n",
 "\n",
-"reviewer_attrs = {'initials': 'JuanFO',\n",
+"reviewer_attrs = {'initials': 'LuciaI',\n",
 " 'type': 'metadata'}\n",
 "\n",
 "#output_filename_path, output_yml_filename_path = hdf5_lib.main()\n",
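For reference, the reviewed-file setup in the metadata annotation notebook now reads roughly as below. Only the setup shown in the diff is reproduced; the actual review calls into metadata_review_lib sit in later cells outside this excerpt.

import src.metadata_review_lib as metadata_review_lib

# Integrated HDF5 file and its YAML metadata counterpart selected for review.
hdf5_file_path = "output_files/kinetic_flowtube_study_2022-01-31_LuciaI.h5"
yml_file_path = "output_files/kinetic_flowtube_study_2022-01-31_LuciaI.yaml"

# Reviewer identification and review type used by the metadata review workflow.
reviewer_attrs = {'initials': 'LuciaI',
                  'type': 'metadata'}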
@@ -141,7 +132,7 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.11.6"
+"version": "3.11.9"
 }
 },
 "nbformat": 4,