diff --git a/edex/distributed-computing/index.html b/edex/distributed-computing/index.html
index c56bb97b0b..2aa4c606ec 100644
--- a/edex/distributed-computing/index.html
+++ b/edex/distributed-computing/index.html
@@ -1075,8 +1075,8 @@
Distributed EDEX
-Currently, with our specific EDEX server we use a Database/Request instance that also decodes and ingests a good portion of the data. It handles all data requests from CAVE users, as well as the majority of the decoding and ingesting for data feeds coming down on the LDM. The radar data has been specifically exluded (from the decoding and ingest) and it has its own Ingest/Decode Server which is explained in more detail below.
-For our EDEX we have designated an instance of the ingest/decoding server to be dedicated to handling the radar data. Our Radar-EDEX recieves and decodes all radar down from the LDM and then stores it back on our main Database/Request EDEX in the form of HDF5 data files and PostgreSQL metadata.
+Currently, we use a distributed architecture composed of three machines: one main EDEX machine and two ancillary EDEX machines. The main EDEX machine decodes and processes the majority of the data, and it stores and serves all of the data. Our two ancillary machines (one for radar data and one for satellite data) each decode and process a subset of the data and send it back to the main EDEX for storage and request handling.
+The main EDEX is an instance of our Database and Request Server; our ancillary EDEX machines are described in more detail below.
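On each ancillary machine, the connection back to the main EDEX is configured in /awips2/edex/bin/setup.env. The lines below are a minimal sketch of that idea only; the hostname is a placeholder, and the exact variable names and ports are assumptions that may differ between AWIPS releases, so check the file shipped with your release:

# /awips2/edex/bin/setup.env on an ancillary ingest EDEX (illustrative values)
export DB_ADDR=main-edex.example.edu                            # PostgreSQL metadata is written to the main EDEX
export PYPIES_SERVER=http://main-edex.example.edu:9582          # processed HDF5 is stored through PyPIES on the main EDEX
export BROKER_ADDR=main-edex.example.edu                        # Qpid messaging broker on the main EDEX
export HTTP_SERVER=http://main-edex.example.edu:9581/services   # request services on the main EDEX

With those pointers in place, the ancillary machine only runs the ingest/decode pieces; everything it produces is stored on, and requested from, the main EDEX.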
Example Installation
This walkthrough installs different EDEX components on two machines in the XSEDE Jetstream Cloud: the first is used to store and serve data, while the second is used to ingest and decode data.
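As a rough sketch of what the walkthrough covers, assuming the Unidata awips_install.sh installer and its --database and --ingest options (treat the URL and flags as assumptions and verify them against the current release notes):

# Machine 1: database/request EDEX (stores and serves data)
wget https://downloads.unidata.ucar.edu/awips2/current/linux/awips_install.sh
chmod 755 awips_install.sh
sudo ./awips_install.sh --database
edex start

# Machine 2: ingest/decode EDEX (decodes data and writes it back to machine 1)
sudo ./awips_install.sh --ingest
# edit /awips2/edex/bin/setup.env to point at machine 1, then:
edex start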
mm MXDVV Max Downdraft Vertical Velocity m/s MXREF Max 1hr CAPPI dB MXSALB Maximum Snow Albedo % MxT Maximum Temperature K MxT Maximum Temperature K MxT12hr 12-hr Maximum Temperature K MxT3hr 3-hr Maximum Temperature K MxT6hr 6-hr Maximum Temperature K MxT_avg Max Temp Ensemble Mean K MxT_perts Max Temp Perturbations K MxT_std Max Temp Ensemble Std Dev K MXUVV Max Updraft Vertical Velocity m/s NBDSF Near IR Beam Downward Solar Flux W/m^2 NBE Neg Buoy Energy J/kg NDDSF Near IR Diffuse Downward Solar Flux W/m^2 NetIO Net Isen Adiabatic Omega Pa/s NLAT Latitude (-90 to 90) deg NST Nonsupercell Tornado (>1 NST Threat) NST1 Nonsupercell Tornado (>1 NST Threat NST2 Nonsupercell Tornado (>1 NST Threat numLevels Number of Levels O3MR Ozone Mixing Ratio kg/kg obscuredSky2IFR ft obscuredSky2LIFR ft obscuredSky2MVFR ft obscuredSky2VFR ft obscuredSky3IFR ft obscuredSky3LIFR ft obscuredSky3MVFR ft obscuredSky3VFR ft obscuredSkyIFR ft obscuredSkyLIFR ft obscuredSkyMVFR ft obscuredSkySym2IFR obscuredSkySym2LIFR obscuredSkySym2MVFR obscuredSkySym2VFR obscuredSkySym3IFR obscuredSkySym3LIFR obscuredSkySym3MVFR obscuredSkySym3VFR obscuredSkySymIFR obscuredSkySymLIFR obscuredSkySymMVFR obscuredSkySymVFR obscuredSkyVFR ft obsWind30T50 kn obsWind50P kn obsWindDir30T50 deg obsWindDir50P deg obsWindDirLow deg obsWindGust30T50 kn obsWindGust50P kn obsWindGustLow kn obsWindLow kn obVis Obstruction to Vision OGRD Current Vectors m/s OmDiff mb between -15C Omega and MaxOmega hPa ONE One OTIM Observation Time OZCON Ozone Concentration ppb OZMAX1 Ozone Daily Max from 1-hour Average ppbV OZMAX8 Ozone Daily Max from 8-hour Average ppbV P Pressure hPa P Pressure Pa PAdv Pressure Adv hPa/s PBE Pos Buoy Energy J/kg PBLREG Planetary Boundary Layer Regime PEC Precipitation Potential Placement in PEC_TT24 24h Cumulative Precip Potential Placement in PERPW Primary Wave Mean Period s PERPW Primary Wave Period s Perranl Pressure Analysis Uncertainty Pa Perranl Pressure Error Analysis Pa PERSW Secondary wave mean period s PERSW Secondary Wave Mean Period s PEVAP Potential Evaporation mm PEVPR Potential Evaporation Rate W/m^2 PFrnt 2-D Frontogenesis/Mag Fn K/m/s PGrd Pressure Gradient hPa/m PGrd1 Pressure Gradient dPa/km PGrdM Pressure Grad Mag hPa/m PICE Pecipitating ice content g/m^3 PIVA Thermal Wind Vort Adv /s pkPwr Peak Power dB PLI Parcel Lifted Index (to 500 mb) K PLIxc1 Prob LI < 0 % PLIxc2 Prob LI < -2 % PLIxc3 Prob LI < -4 % PLIxc4 Prob LI < -6 % PLIxc5 Prob LI < -8 % PMSL Pressure Reduced to MSL Pa PMSLmean Mean Sea Level Pressure mean hPa PMSLsprd Mean Sea Level Pressure sprd hPa poesDif11u3_7uIR POES 11u-3.7u Satellite GenericPixel POP Probability of precip % POP12hr 12hr precip probability % POP3hr 3hr precip probability % POP6 POP 6hr % POP6hr 6hr precip probability % POP_001 Prob of .1in/6hr Precip % POP_002 Prob of .3in/6hr Precip % POP_003 Prob of .6in/6hr Precip % POP_004 Prob of 1in/6hr Precip % POP_005 Prob of 2in/6hr Precip % POP_006 Prob of .1in/12hr Precip % POP_007 Prob of .3in/12hr Precip % POP_008 Prob of .6in/12hr Precip % POP_009 Prob of 1in/12hr Precip % POP_010 Prob of 2in/12hr Precip % POP_011 Prob of .05in/6hr Precip % POP_012 Prob of .05in/12hr Precip % POP_013 Prob of 1in/24hr Precip % POP_014 Prob of 2in/24hr Precip % POP_015 Prob of 2in/36hr Precip % POP_016 Prob of 2in/48hr Precip % POROS Soil Porosity Proportion POSH Probability of Severe Hail (POSH) % PoT Potential Temp K PoT Potential Temperature K PoTA Pot Temp Adv K/s PPAM Prob Precip abv nrml % PPAN Prob Precip abv nrml % PPAS Prob 
Precip abv nrml % PPBM Prob Precip blw nrml % PPBN Prob Precip blw nrml % PPBS Prob Precip blw nrml % PPFFG Probability of excessive rain % PPI Precipitation Probability Index % PPI1hr Precipitation Probability Index(1 hour) % PPI6hr Precipitation Probability Index(6 hour) % PPNN Prob Precip near nrml % PR Precip Rate mm/s PR Precipitation Rate mm/s prCloudHgt prCLoud converted to Hgt m prCloudHgtHi prCloudHgt when in hi layer m prCloudHgtLow prCloudHgt when in low layer m prCloudHgtMid prCloudHgt when in mid layer m prcp12hr 12hr probability of 0.01 inch of precip % prcp3hr 3hr probability of 0.01 inch of precip % prcp6hr 6hr probability of 0.01 inch of precip % Precip24Hr Precip24Hr in Precip3Hr Precip3Hr in Precip6Hr Precip6Hr in PrecipRate Radar Precipitation Rate (SPR) mm/hr PrecipType Surface Precipitation Type (SPT) PRESA Pressure Anomaly Pa PresStk Obsolete, replace later presWeather Present Weather Prob34 Prob of Wind Speed > 34 knots m/s Prob50 Prob of Wind Speed > 50 knots m/s Prob64 Prob of Wind Speed > 64 knots m/s ProbDpT50 Probability of Dewpoint temp > 50 degF % ProbDpT55 Probability of Dewpoint temp > 55 degF % ProbDpT60 Probability of Dewpoint temp > 60 degF % ProbDpT65 Probability of Dewpoint temp > 65 degF % ProbDpT70 Probability of Dewpoint temp > 70 degF % ProbVSS10p3Layer Prob Vertical Speed Shear > 20 kts % ProbVSS10p3Sfc Prob 0-2kft Shear > 20 kts % PROCON Probability of convection % PROCON2hr 2hr Convection probability % PROLGHT Lightning probability % PROLGHT2hr 2hr Lightning probability % PRP01H 1hr MRMS Radar-Only ARI year PRP03H 3hr MRMS Radar-Only ARI year PRP06H 6hr MRMS Radar-Only ARI year PRP12H 12hr MRMS Radar-Only ARI year PRP24H 24hr MRMS Radar-Only ARI year PRP30M 30min MRMS Radar-Only ARI year PRPMax Maximum MRMS Radar-Only ARI year PRSIGSV Total Probability of Extreme Severe Thunderstorms % PRSVR Total Probability of Severe Thunderstorms % Psfc Surface pressure hPa PT3 3 hr Pres Change hPa PTAM Prob Temp abv nrml % PTAN Prob Temp abv nrml % PTAS Prob Temp abv nrml % PTBM Prob Temp blw nrml % PTBN Prob Temp blw nrml % PTBS Prob Temp blw nrml % PTNN Prob Temp near nrml % Ptopo Surface pressure hPa PTOR Tornado Probability % PTvA Pot Vorticity Adv K/hPa/s*1.0E5 PTyp Precip Type PTypeRefIP Prob Precip Type is Refreezing Ice Pellets % pV Potential Vorticity K/hPa/s pVeq Equiv Pot Vort K/hPa/s PVORT Potential Vorticity m^2 kg^-1 s^-1 PVV Omega Pa/s PVV Vertical Velocity Pressure Pa/s PW Precipitable Water mm PW Preciptable H2O in PW2 Preciptable H2O >1.4 in. 
in PWmean Precipitable Water mean mm PWS34 Incremental Prob of wind speed >= 34 knots % PWS50 Incremental Prob of wind speed >= 50 knots % PWS64 Incremental Prob of wind speed >= 64 knots % PWsprd Precipitable Water sprd mm qDiv Div Q K/m^2/s*1.0E-12 QMAX Maximum specific humidity at 2m kg/kg QMIN Minimum specific humidity at 2m kg/kg qnVec Qn Vectors K/m^2/s QPECrestSoilMoisture QPE-CREST Soil Moisture % QPECrestStreamflow QPE-CREST Maximum Streamflow (m^3)*(s^-1) QPECrestUStreamflow QPE-CREST Maximum Unit Streamflow (m^3) (s^-1) (km^-2) QPEFFG01H 1hr MRMS Radar-Only QPE-to-FFG Ratio QPEFFG03H 3hr MRMS Radar-Only QPE-to-FFG Ratio QPEFFG06H 6hr MRMS Radar-Only QPE-to-FFG Ratio QPEFFGMax Maximum MRMS Radar-Only QPE-to-FFG Ratio QPEHPStreamflow QPE-Hydrophobic Maximum Streamflow (m^3)*(s^-1) QPEHPUStreamflow QPE-Hydrophobic Maximum Unit Streamflow (m^3) (s^-1) (km^-2) QPESacSoilMoisture QPE-SAC-SMA Soil Moisture % QPESacStreamflow QPE-SAC-SMA Maximum Streamflow (m^3)*(s^-1) QPESacUStreamflow QPE-SAC-SMA Maximum Unit Streamflow (m^3) (s^-1) (km^-2) QPV1 QVec Conv K/m^2/s*1.0E-12 QPV2 Negative EPV* K/hPa/s QPV3 QPV Net QPV4 QG-EPV, RH>75% qsVec Qs Vectors K/m^2/s qVec Q Vectors K/m^2/s RadarAQI01H Radar Accumulation Quality Index 1 hour RadarAQI03H Radar Accumulation Quality Index 3 hour RadarAQI06H Radar Accumulation Quality Index 6 hour RadarAQI12H Radar Accumulation Quality Index 12 hour RadarAQI24H Radar Accumulation Quality Index 24 hour RadarAQI48H Radar Accumulation Quality Index 48 hour RadarAQI72H Radar Accumulation Quality Index 72 hour RadarOnlyQPE01H QPE - Radar Only (1 hr. accum.) mm RadarOnlyQPE03H QPE - Radar Only (3 hr. accum.) mm RadarOnlyQPE06H QPE - Radar Only (6 hr. accum.) mm RadarOnlyQPE12H QPE - Radar Only (12 hr. accum.) mm RadarOnlyQPE12Z QPE - Radar Only (Since 12Z accum.) mm RadarOnlyQPE15M QPE - Radar Only (15 min accum.) mm RadarOnlyQPE24H QPE - Radar Only (24 hr. accum.) mm RadarOnlyQPE48H QPE - Radar Only (48 hr. accum.) mm RadarOnlyQPE72H QPE - Radar Only (72 hr. accum.) 
mm RadarQualityIndex Radar Quality Index (RQI) RAIN Rain content g/m^3 Rain1 850-1000 ra thk Rain2 700-850 ra thk Rain3 Thickness: Rain Likely Raob Raob Interleaved Data rawMETAR24Chg rawMETAR24Chg \u2103 RCQ Humidity parameter in canopy conductance Proportion RCS Solar parameter in canopy conductance Proportion RCSOL Soil moisture parameter in canopy conductance Proportion Reflectivity0C Reflectivity at 0C dBZ ReflectivityAtLowestAltitude Reflectivity At Lowest Altitude (RALA) dBZ ReflectivityM10C Reflectivity at -10C dBZ ReflectivityM15C Reflectivity at -15C dBZ ReflectivityM20C Reflectivity at -20C dBZ ReflectivityM5C Reflectivity at -5C dBZ RETOP Echo Top m RH Rel Humidity % RH Relative Humidity % RH_001 Prob of RH Grtn 70 percent % RH_001_bin Binary Prob of RH Grtn 70 percent RH_001_perts Prob of RH Grtn 70 percent Perts RH_002 Prob of RH Grtn 90 percent % RH_002_bin Binary Prob of RH Grtn 90 percent RH_002_perts Prob of RH Grtn 90 percent Perts RH_avg Rel Humidity Ensemble Mean % RH_perts Rel Humidity Perturbations % RH_std Rel Humidity Ensemble Std Dev % RHmean Relative Humidity mean % RHsprd Relative Humidity spread % RIME Rime Factor non-dim RLYRS Number of Soil Layers in Root Zone Numeric RM5 Bunkers Right-Moving Supercell m/s RM6 Elevated Right-Moving Supercell m/s RMGH2 t-2Day Mean Hgt m RMprop Right Mover Propagation Vector RMprop2 Elevated Right Mover Propagation Vector rms root mean square kn Ro Rossby Number Vag/Vg RotationTrackLL120min Low-Level Rotation Tracks 0-2km AGL (120 min. accum.) 1/s RotationTrackLL1440min Low-Level Rotation Tracks 0-2km AGL (1440 min. accum.) 1/s RotationTrackLL240min Low-Level Rotation Tracks 0-2km AGL (240 min. accum.) 1/s RotationTrackLL30min Low-Level Rotation Tracks 0-2km AGL (30 min. accum.) 1/s RotationTrackLL360min Low-Level Rotation Tracks 0-2km AGL (360 min. accum.) 1/s RotationTrackLL60min Low-Level Rotation Tracks 0-2km AGL (60 min. accum.) 1/s RotationTrackML120min Mid-Level Rotation Tracks 3-6km AGL (120 min. accum.) 1/s RotationTrackML1440min Mid-Level Rotation Tracks 3-6km AGL (1440 min. accum.) 1/s RotationTrackML240min Mid-Level Rotation Tracks 3-6km AGL (240 min. accum.) 1/s RotationTrackML30min Mid-Level Rotation Tracks 3-6km AGL (30 min. accum.) 1/s RotationTrackML360min Mid-Level Rotation Tracks 3-6km AGL (360 min. accum.) 1/s RotationTrackML60min Mid-Level Rotation Tracks 3-6km AGL (60 min. accum.) 
1/s routed_flow Channel Routed Flow [Low] routed_flow_c Channel Routed Flow [Combo] routed_flow_h Channel Routed Flow [Hi] routed_flow_m Channel Routed Flow [Mid] RR Reflectivity dBZ RRtype Radar w/PType dBZ RRV Radial Velocity kts RSMIN Minimal Stomatal Resistance s/m RV Rel Vorticity /s RWMR Rain Mixing Ratio kg/kg s2H2O_CLIMO Climatological -SON/DJF/MAM- Snow-to-water ratio s2H2O_GFS GFS Snow-to-water ratio s2H2O_MEAN HPC Mean Snow-to-water ratio s2H2O_NAM NAM Snow-to-water ratio SA12hr 12 Hr Snow Accum mm SA1hr 1 Hr Snow Accum mm SA24hr 24 Hr Snow Accum mm SA36hr 36 Hr Snow Accum mm SA3hr 3 Hr Snow Accum mm SA48hr 48 Hr Snow Accum mm SA6hr 6 Hr Snow Accum mm SAcc Snow Accum via Thickness mm SALIN Practical Salinity SALTY Salinity kg/kg SAmodel Model Run Snow via Thickness mm SArun Model Run Snow Accum via Thickness mm satCloudPhase Satellite Cloud Phase[8.5-11.2 um] K SATD Saturation Deficit Pa satDif11u12uIR 11u-12u Satellite GenericPixel satDif11u13uIR 11u-13u Satellite GenericPixel satDif11u3_9uIR 11u-3.9u Satellite GenericPixel satDivWVIR IR in WV Satellite DerivedWV satFog Satellite Fog[3.9-11.2 um] K satMoisture Satellite Moisture[11.2-12.3 um] K satSnow Satellite Snow[0.64-1.61 um] satUpperLevelInfo Satellite Upper Level Info[11.2-6.19 um] K satVegetation Satellite Vegetation[0.64-0.87 um] SBSNO Sublimation (evaporation from snow) W/m^2 SBT113 Simulated Brightness Temperature for GOES 11, Channel 3 K SBT114 Simulated Brightness Temperature for GOES 11, Channel 4 K SBT123 Simulated Brightness Temperature for GOES 12, Channel 3 K SBT124 Simulated Brightness Temperature for GOES 12, Channel 4 K sce NOHRSC Snow Coverage Elevation kft SCP Snow Cover SCP Snow Cover % SCWind SCWind m/s SDEN Snow Density kg/m\u00b3 SDENCLIMO Climatological -SON/DJF/MAM- Snow Density kg/m\u00b3 SDENGFS GFS Snow Density kg/m\u00b3 SDENMEAN HPC Mean Snow Density kg/m\u00b3 SDENNAM NAM Snow Density kg/m\u00b3 SeamlessHSR Seamless Hybrid Scan Reflectivity (SHSR) dBZ SeamlessHSRHeight Seamless Hybrid Scan Reflectivity (SHSR) Height km SFCR Surface Roughness m SH Spec Humidity SH Specific Humidity % Shear Shear (Vector) /s SHF Sensible Heat Flux W/m^2 SHI Severe Hail Index (SHI) ShrMag Shear Magnitude /s shWlt Showalter Index \u2103 SHx Spec Humidity g/kg SIGHAILPROB Significant Hail Probability % SIGTRNDPROB Significant Tornado Probability % SIGWINDPROB Significant Wind Probability % SIPD Supercooled Large Droplet Threat SLDP Supercooled Large Droplet Threat SLI Lifted Index K SLI Surface Lifted Index K SLTYP Surface Slope Type Index SMC Soil Moisture % SMDRY Direct Evaporation Cease (soil moisture) Proportion SMREF Transpiration Stress-onset (soil moisture) Proportion SnD Snow Depth m SnD Snow Depth m SNFALB Snow-Free Albedo SNMR Snow Mixing Ratio kg/kg SNOL12c1 Prob 12-hr SNOW > 1 in % SNOL12c10 Prob 12-hr SNOW > 24 in % SNOL12c2 Prob 12-hr SNOW > 2 in % SNOL12c3 Prob 12-hr SNOW > 4 in % SNOL12c4 Prob 12-hr SNOW > 6 in % SNOL12c5 Prob 12-hr SNOW > 7.5 in % SNOL12c6 Prob 12-hr SNOW > 8 in % SNOL12c7 Prob 12-hr SNOW > 10 in % SNOL12c8 Prob 12-hr SNOW > 12 in % SNOL12c9 Prob 12-hr SNOW > 16 in % SNOL12mean 12-hr Snowfall mean mm SNOL12sprd 12-hr Large scale Snowfall sprd mm SNOM Snow Melt kg/m^2 snoRat snoRatCrocus Snow Ratio - Crocus/ECMWF snoRatEMCSREF Snow Ratio: EMC SREF snoRatOv2 snoRatSPC Snow Ratio - SPC snoRatSPCdeep Snow Ratio - SPC 0-3km MaxT snoRatSPCsurface Snow Ratio - SPCsurface snoRatWPC Snow Ratio - WPC Mean SNOW Snow content g/m^3 Snow1 850-1000 sn thk Snow2 700-850 sn thk Snow3 Thickness: 
Snow Likely snowd3hr 3hr Snow Depth m snowd6hr 6hr Snow Depth m SNOWLVL Snow Level m SnowT Preferred Ice Growth K SNSQ Snow Sql Parameter SNW Sect Norm Wind m/s SNWA Ageo Sect Norm Wind kn SOILM Soil Moisture Content kg/m^2 SOILW Volumetric Soil Moisture Content Proportion SOTYP Soil Type SPAcc Storm Total Precip mm SPBARO Barotropic Velocity m/s SPC Current Speed m/s SPC Surface Current Speed m/s Spd24Chg Spd24Chg kn sRank Feature Strength Rank SRMl Storm Relative Flow Vectors LM m/s SRMlM Storm Relative Flow Mag LM m/s SRMm Storm Relative Flow Vecs (Mean Wind) m/s SRMmM Storm Relative Flow Mag (Mean Wind) m/s SRMr Storm Relative Flow Vecs (RM) m/s SRMrM Storm Relative Flow Mag (RM) m/s SSAcc Storm Total Snow mm SSi Isentropic Static Stability hPa/K SSP Significant Severe Parameter SSRUN Storm Surface Runoff kg/m^2 St-Pr Stable Precipitation mm St-Pr1hr 1 hr Stable Precipitation mm St-Pr2hr 2 hr Stable Precipitation mm St-Pr3hr 3 hr Stable Precipitation mm staName StaName stationId Station Id C stdDewpoint Std Dewpoint K stdMaxWindSpeed Std Max Wind Speed m/s stdSkyCover Std Sky Cover stdTemperature Std Temperature K stdWindDir Std Wind Direction stdWindSpeed Std Wind Speed m/s STP Sig. Tornado Parameter (>1 Sig Tor) STP1 Sig. Tornado Parameter (>1 Sig Tor) STRM Stream Function m^2/s StrmMot Storm Motion kn StrTP Strong Tornado Parameter m/s^2 SuCP Supercell Composite Parameter SUNSD Sunshine Duration s SuperLayerCompositeReflectivity Super Layer Composite Reflectivity (33-60 kft) dBZ SVV Sigma Coordinate Vertical Velocity /s SWDIR Direction of Swell Waves deg SWdir Swell Direction swe NOHRSC Snow Water Equivalent in SWELL Significant Height of Swell Waves m SWELL Swell Height m SWHR Solar Radiative Heating Rate K/s SWLEN Mean length of swell waves m SWPER Mean Period of Swell Waves s SWPER Swell Period s SWSTP Steepness of swell waves swtIdx Sweat Index SynPrecip24Hr SynPrecip24Hr mm SynthPrecipRateID QPE - Synthetic Precip Rate ID T Temperature K T Temperature K T24Chg T24Chg \u00b0F T24hr 24 hr Temperature K T_001 Prob of Temp Lstn 0C % T_001_bin Binary Prob of Temp Lstn 0C T_001_perts Prob of Temp Lstn 0C Perturbations T_avg Temperature Ensemble Mean K T_perts Temperature Perturbations K T_std Temperature Ensemble Std Dev K Ta Temperature Anomaly K TAdv Temperature Adv K/s Tc1 Prob Temp < O C % TCC Total Cloud Cover % TCCerranl Total Cloud Cover Error Analysis % TCICON Total Column-Integrated Condensate kg/m^2 TCLSW Total Column Integrated Supercooled Liquid Water kg/m^2 TCOLG Total Column Integrated Graupel kg/m^2 TCOLI Total Column-Integrated Cloud Ice kg/m^2 TCOLM Total Column Integrated Melting Ice kg/m^2 TCOLR Total Column Integrated Rain kg/m^2 TCOLS Total Column Integrated Snow kg/m^2 TCOLW Total Column-Integrated Cloud Water kg/m^2 TCOND Total Condensate kg/kg Tdef Total Deformation /s*100000.0 Tdend Dendritic Growth Temperatures K Terranl Temperature Analysis Uncertainty K Terranl Temperature Error Analysis K TGrd Temperature Gradient K/m TGrdM Temperature Grad Mag K/m ThetaE Theta E K ThGrd Temperature Gradient \u2103/m Thom5 S-R Flow Thom5a S-R Flow Thom6 S-R Flow Suggests Tor Supercells ThP Thunderstorm probability % ThP Thunderstorm Probability % ThP12hr 12hr Thunderstorm probability % ThP3hr 3hr Thunderstorm probability % ThP6hr 6hr Thunderstorm probability % ThPcat Categorical thunderstorm TiltAng Radar Tilt Angle deg TKE Turb Kin Energy J/kg TKE Turbulent Kinetic Energy J/kg Tmax Layer Max Temperature K TmDpD Temp minus Dewp Dep Tmean Temperature mean K Tmin Layer Min 
Temperature K Topo Topography m TORi BRNSHR,EHI,LRate>3C/km,CIN < 150 TORi2 BRNSHR,EHI,0-2km LRate > 3C/km TotQi Isentropic Total Moisture g\u00b7hPa/(kg\u00b7K) TOTSN 24hr Snowfall m TOTSN12hr 12hr Snowfall m TOZNE Total Ozone DU TP Precipitation mm TP Total Precipitation mm TP120hr 5 Day Total Gridded Precip in TP12c1 12-hr POP > 0.01 in % TP12c2 12-hr POP > 0.05 in % TP12c3 12-hr POP > 0.10 in % TP12c4 12-hr POP > 0.25 in % TP12c5 12-hr POP > 0.50 in % TP12c6 12-hr POP > 1.00 in % TP12c7 12-hr POP > 1.50 in % TP12c8 12-hr POP > 2.00 in % TP12hr 12 Hr Accum Precip mm TP12hr Total Precipitation(12 hours) mm TP12mean 12-hr Total Precip mean mm TP12sprd 12-hr Total Precip sprd mm TP168hr 7 Day Total Gridded Precip mm TP18hr Total Precipitation(18 hours) mm TP1hr 1 Hr Accum Precip mm TP1hr Total Precipitation(1 hour) mm TP24c1 24-hr POP > 0.01 in % TP24c2 24-hr POP > 0.05 in % TP24c3 24-hr POP > 0.10 in % TP24c4 24-hr POP > 0.25 in % TP24c5 24-hr POP > 0.50 in % TP24c6 24-hr POP > 1.00 in % TP24c7 24-hr POP > 1.50 in % TP24c8 24-hr POP > 2.00 in % TP24hr 24 Hr Accum Precip mm TP24hr Total Precipitation(24 hours) mm TP24hr_avg 24hr Precip Ensemble Mean mm TP24hr_perts 24hr Precip Perturbations mm TP24hr_std 24hr Precip Ensemble Std Dev mm TP24mean 24-hr Total Precip mean mm TP24sprd 24-hr Total Precip sprd mm TP36hr 36 Hr Accum Precip mm TP3c1 3-hr POP > 0.01 in % TP3c2 3-hr POP > 0.05 in % TP3c3 3-hr POP > 0.10 in % TP3c4 3-hr POP > 0.25 in % TP3c5 3-hr POP > 0.50 in % TP3c6 3-hr POP > 1.00 in % TP3c7 3-hr POP > 1.50 in % TP3c8 3-hr POP > 2.00 in % TP3hr 3 Hr Accum Precip mm TP3hr Total Precipitation(3 hours) mm TP3mean 3-hr Total Precip mean mm TP3sprd 3-hr Total Precip sprd mm TP48hr 48 Hr Accum Precip mm TP48hr Total Precipitation(48 hours) mm TP6c1 6-hr POP > 0.01 in % TP6c2 6-hr POP > 0.05 in % TP6c3 6-hr POP > 0.10 in % TP6c4 6-hr POP > 0.25 in % TP6c5 6-hr POP > 0.50 in % TP6c6 6-hr POP > 1.00 in % TP6c7 6-hr POP > 1.50 in % TP6c8 6-hr POP > 2.00 in % TP6hr 6 Hr Accum Precip mm TP6hr Total Precipitation(6 hours) mm TP6hr_avg 6hr Precip Ensemble Mean mm TP6hr_perts 6hr Precip Perturbations mm TP6hr_std 6hr Precip Ensemble Std Dev mm TP6mean 6-hr Total Precip mean mm TP6sprd 6-hr Total Precip sprd mm TP72hr 3 Day Total Gridded Precip mm TP9hr Total Precipitation(9 hours) mm TP_ACR ACR Precip in TP_ALR ALR Precip in TP_avg Precip Ensemble Mean mm TP_ECMWF ECMWF Precipitation in TP_ECMWF12hr ECMWF 12 Hr Accum Precip in TP_FWR FWR Precip in TP_HPC HPC Precip in TP_KRF KRF Precip in TP_MSR MSR Precip in TP_ORN ORN Precip in TP_perts Precip Perturbations mm TP_PTR PTR Precip in TP_RHA RHA Precip in TP_RSA RSA Precip in TP_std Precip Ensemble Std Dev mm TP_STR STR Precip in TP_TAR TAR Precip in TP_TIR TIR Precip in TP_TUA TUA Precip in TPFI Turbulence Index TPFI Turbulence Potential Forecast Index TP-GFS Total Precipitation for GFS mm tpHPC HPC Precip in tpHPCndfd Precipitation mm TPmodel Model Run Precip mm TPrun Run Accum Pcpn mm TPrun_avg Accum Precip Ensemble Mean mm TPrun_perts Accum Precip Perturbations mm TPrun_std Accum Precip Ensemble Std Dev mm TPx12x6 12-6 Hr Accum Precip mm TPx1x3 3x1 Hr Accum Precip mm TPx3 3 Hr Accum Precip mm TQIND TQ Index 12=Cold Pool 17=Embedded Convection C TRANS Transpiration W/m^2 transparentMaritimeSky ft transparentMaritimeSkySym ft transparentSky ft transparentSky2 ft transparentSky3 ft transparentSkySym ft transparentSkySym2 ft transparentSkySym3 ft TransWind TransWind kts TShrMi S=0-6km Shear Supports Scells TSLSA 3 hr Pres Change hPa TSNOW Total 
Snow kg/m^2 TSOIL Soil Temperature K Tsprd Temperature spread K TSRWE Total Snowfall Rate Water Equivalent kg/m^2/s Tstk Temp Stack K tTOT Total Totals C TURB Turbulence Index TV Virtual Temperature K TW Wet Bulb Temp K tWind Thermal Wind kn tWindU U Component of Thermal Wind kn tWindV V Component of Thermal Wind kn TwMax Layer Max Wet-bulb Temperature K TwMin Layer Min Wet-bulb Temperature K TWO Two Twstk Wet-bulb Temp Stack K TxSM Filtered-500km Temp C U-GWD Zonal Flux of Gravity Wave Stress N/m^2 UFLX Momentum Flux, U-Component N/m^2 uFX Geo Momentum m/s ulSnoRat ULWRF Comp Refl dBZ ULWRF Upward Long-Wave Rad. Flux W/m^2 UPHL Updraft Helicity m^2/s^2 USTM U-Component of Storm Motion m/s USWRF Reflectivity dBZ USWRF Upward Short-Wave Radiation Flux W/m^2 uv2 Horz Variance m^2/s^2 uW u Component of Wind m/s uW U-Component of Wind m/s uWerranl uWmean m/s uWsprd uWStk U Stack m/s uzfwc Upper Zone Free Water Content % uztwc Upper Zone Tension Water Content % V-GWD Meridional Flux of Gravity Wave Stress N/m^2 VAdv Vorticity Adv /s*1.0E9 VAdvAdvection Vorticity Adv /s VAPP Vapor Pressure Pa VBDSF Visible Beam Downward Solar Flux W/m^2 VEG Vegetation % vertCirc Vertical Circulation VFLX Momentum Flux, V-Component N/m^2 VGP Vort Gen Param VGTYP Vegetation Type Integer (0-13) VII Vertically Integrated Ice (VII) kg/m^2 VILIQ Vertically Integrated Liquid (VIL) kg/m^2 Vis Visibility m Vis Visibility m visbyIFR mi visbyLIFR mi visbyMVFR mi visbyVFR mi Visc1 Prob Sfc Visibility < 1 mile % Visc2 Prob Sfc Visibility < 3 miles % Visc23 Prob Sfc Visibility < 5 miles % visCat Categorical visibility Viserranl Visibility Analysis Uncertainty m Viserranl Visibility Error Analysis m Visible Visible Imagery VPT Virtual Potential Temperature K VRATE Ventilation Rate m^2/s vSmthW Verticall Smoothed Wind m/s VSS Vertical Shear Speed /s VSTM V-Component of Storm Motion m/s VTMP Virtual Temperature K vTOT Vertical Totals VUCSH Vertical u-component shear /s VV Vertical velocity m/s VVCSH Vertical v-component shear /s vW v Component of Wind m/s vW V-Component of Wind m/s vWerranl vWmean m/s vwpSample VWP Sample VWSH Vertical Speed Shear /s vWsprd vWStk V Stack m/s w2 Vert Variance m^2/s^2 WarmRainProbability Probability of Warm Rain % water_depth Hillslope Water Depth in WaterVapor Water Vapor Imagery K WATR Water Runoff kg/m^2 WCD Warm Cloud Depth Approx.: Frzlvl-LCL Thickness m WD Wind Direction (from which blowing) deg WD Wind direction deg WDea Wind Direction Analysis Uncertainity deg WDEPTH Geometric Depth Below Sea Surface m WDerranl Wind Direction Error Analysis deg wDiv Wind Divergence /s WDmean Wind Direction mean deg WEASD Water Equiv accum snow depth m WEASD Water Equivalent of Accumulated Snow Depth mm WGH 5-Wave Geopotential Height gpm WGH 5-wave geopotential height m WGS Wind Gust Speed m/s WGS Wind Gust Speed m/s WGS1hr Max 1-hr Wind Gust Speed m/s WGSea Wind Gust Speed Analysis Uncertainty m/s WGSerranl Wind Gust Speed Error Analysis m/s WGSMX1hr Max Hourly Wind Gust m/s WILT Wilting Point Proportion Wind Wind m/s Wind_avg Wind Ensemble Mean m/s Wind_perts Wind Perturbations m/s Windmean Mean Wind kn WINDPROB Wind Probability % WMIXE Wind Mixing Energy J WndChl Wind Chill K WS Wind Speed m/s WSc1 Prob SFC wind speed > 25 kt % WSc2 Prob SFC wind speed > 34 kt % WSc3 Prob SFC wind speed > 48 kt % WSc4 Prob SFC wind speed > 50 kt % WSc6 Prob SFC wind speed > 20 kt % WSc7 Prob SFC wind speed > 30 kt % WSc8 Prob SFC wind speed > 40 kt % WSerranl Wind Speed Error Analysis m/s WSmean Wind Speed mean m/s wSp 
Wind speed m/s wSp_001 Prob of Wind Grtn 40kts % wSp_001_bin Binary Prob of Wind Grtn 40kts wSp_001_perts Prob of Wind Grtn 40kts Perts wSp_002 Prob of Wind Grtn 50kts % wSp_002_bin Binary Prob of Wind Grtn 50kts wSp_002_perts Prob of Wind Grtn 50kts Perts wSp_003 Prob of Wind Grtn 60kts % wSp_003_bin Binary Prob of Wind Grtn 60kts wSp_003_perts Prob of Wind Grtn 60kts Perts wSp_004 Prob of Wind Grtn 30kts % wSp_004_bin Binary Prob of Wind Grtn 30kts wSp_004_perts Prob of Wind Grtn 30kts Perts wSp_avg Windspeed Ensemble Mean m/s wSp_perts Windspeed Perturbations m/s wSp_std Windspeed Ensemble Std Dev m/s wSpea Wind Speed Analysis Uncertainty kn wSpmean Mean Windspeed kt wSpsprd Windspread spread kt WSsprd Wind Speed sprd m/s WVDIR Direction of Wind Waves deg WVdir Wind Wave Direction wvHeight wvHeight m WVHGT Significant Height of Wind Waves m WVHGT Wind Wave Height m WVLEN Mean length of wind waves m WVPER Mean Period of Wind Waves s WVPER Wind Wave Period s wvPeriod wvPeriod WVSTP Steepness of wind waves wvType wvType wW w Component of Wind cm/s wx Weather zAGL Height AGL m ZDR Differential Reflectivity dB","title":"AWIPS Grid Parameters"},{"location":"appendix/appendix-wsr88d/","text":"Product Name Mnemonic ID Levels Res Elevation Reflectivity (Z) Z 19 16 100 .5 Reflectivity (Z) Z 19 16 100 1.5 Reflectivity (Z) Z 19 16 100 2.5 Reflectivity (Z) Z 19 16 100 3.5 Reflectivity (Z) Z 20 16 200 .5 Velocity (V) V 27 16 100 .5 Velocity (V) V 27 16 100 1.5 Velocity (V) V 27 16 100 2.5 Velocity (V) V 27 16 100 3.5 Storm Rel Velocity (SRM) SRM 56 16 100 .5 Storm Rel Velocity (SRM) SRM 56 16 100 1.5 Storm Rel Velocity (SRM) SRM 56 16 100 2.5 Storm Rel Velocity (SRM) SRM 56 16 100 3.5 Composite Ref (CZ) CZ 37 16 100 -1 Composite Ref (CZ) CZ 38 16 400 -1 Lyr Comp Ref Max (LRM) Level 1 LRM 65 8 0 -1 Lyr Comp Ref Max (LRM) Level 2 LRM 66 8 0 -1 Lyr Comp Ref Max (LRM) Level 3 LRM 90 8 0 -1 Lyr Comp Ref MAX (APR) APR 67 16 0 -1 Echo Tops (ET) ET 41 16 0 -1 Vert Integ Liq (VIL) VIL 57 16 0 -1 One Hour Precip (OHP) OHP 78 16 0 -1 Storm Total Precip (STP) STP 80 16 0 -1 VAD Wind Profile (VWP) VWP 48 0 0 -1 Digital Precip Array (DPA) DPA 81 256 400 -1 Velocity (V) V 25 16 100 .5 Base Spectrum Width (SW) SW 28 8 100 .5 Base Spectrum Width (SW) SW 30 8 100 .5 Severe Weather Probablilty (SWP) SWP 47 0 100 -1 Storm Tracking Information (STI) STI 58 0 100 -1 Hail Index (HI) HI 59 0 100 -1 Mesocyclone (M) M 60 0 100 -1 Mesocyclone (MD) MD 141 0 0 1 Tornadic Vortex Signature (TVS) TVS 61 0 100 -1 Storm Structure (SS) SS 62 0 100 -1 Supplemental Precipitation Data (SPD) SPD 82 0 100 -1 Reflectivity (Z) Z 94 256 100 .5 Reflectivity (Z) Z 94 256 100 1.5 Reflectivity (Z) Z 94 256 100 2.4 Reflectivity (Z) Z 94 256 100 3.4 Reflectivity (Z) Z 94 256 100 4.3 Reflectivity (Z) Z 94 256 100 5.3 Reflectivity (Z) Z 94 256 100 6.2 Reflectivity (Z) Z 94 256 100 7.5 Reflectivity (Z) Z 94 256 100 8.7 Reflectivity (Z) Z 94 256 100 10.0 Reflectivity (Z) Z 94 256 100 12.0 Reflectivity (Z) Z 94 256 100 14.0 Reflectivity (Z) Z 94 256 100 16.7 Reflectivity (Z) Z 94 256 100 19.5 Velocity (V) V 99 256 25 .5 Velocity (V) V 99 256 25 1.5 Velocity (V) V 99 256 25 2.4 Velocity (V) V 99 256 25 3.4 Velocity (V) V 99 256 25 4.3 Velocity (V) V 99 256 25 5.3 Velocity (V) V 99 256 25 6.2 Velocity (V) V 99 256 25 7.5 Velocity (V) V 99 256 25 8.7 Velocity (V) V 99 256 25 10.0 Velocity (V) V 99 256 25 12.0 Velocity (V) V 99 256 25 14.0 Velocity (V) V 99 256 25 16.7 Velocity (V) V 99 256 25 195 Super Res Reflectivity (Z) HZ 153 256 25 .5 Super Res 
Reflectivity (Z) HZ 153 256 25 1.5 Super Res Velocity (V) HV 154 256 25 .5 Super Res Velocity (V) HV 154 256 25 1.5 Super Res Spec Width (SW) HSW 155 256 25 .5 Super Res Spec Width (SW) HSW 155 256 25 1.5 Spectrum Width (SW) SW 30 8 100 1.5 Spectrum Width (SW) SW 28 8 25 1.5 Digital Vert Integ Liq (DVL) DVL 134 256 100 -1 Digital Hybrid Scan Refl (DHR) DHR 32 256 100 -1 Enhanced Echo Tops (EET) EET 135 256 100 -1 Digital Meso Detection (DMD) DMD 149 0 0 16384 TVS Rapid Update (TRU) TRU 143 0 0 16384 User Selectable Lyr Refl (ULR) ULR 137 16 100 -1 Storm Total Precip (STP) STP 138 256 200 -1 1-Hour Snow-Water Equiv (OSW) OSW 144 16 100 -1 1-Hour Snow Depth (OSD) OSD 145 16 100 -1 Storm Tot Snow Depth (SSD) SSD 147 16 100 -1 Storm Tot Snow-Water Equiv (SSW) SSW 146 16 100 -1 Differential Refl (ZDR) ZDR 158 16 100 .5 Differential Refl (ZDR) ZDR 159 256 25 16384 Correlation Coeff (CC) CC 160 16 100 .5 Correlation Coeff (CC) CC 161 256 25 16384 Specific Diff Phase (KDP) KDP 162 16 100 .5 Specific Diff Phase (KDP) KDP 163 256 25 16384 Hydrometeor Class (HC) HC 164 16 100 .5 Hydrometeor Class (HC) HC 165 256 25 16384 Melting Layer (ML) ML 166 0 0 16384 Hybrid Hydrometeor Class (HHC) HHC 177 256 25 -1 Digital Inst Precip Rate (DPR) DPR 176 0 25 -1 One Hour Accum (OHA) OHA 169 16 200 -1 User Select Accum (DUA) DUA 173 256 25 -1 User Select Accum (DUA) DUA 173 256 25 -1 Storm Total Accum (STA) STA 171 16 200 -1 Storm Total Accum (DSA) STA 172 256 25 -1 One Hour Diff (DOD) DOD 174 256 25 -1 Storm Total Diff (DSD) DSD 175 256 25 -1","title":"WSR-88D Product Table"},{"location":"appendix/common-problems/","text":"Common Problems \uf0c1 All Operating Systems \uf0c1 Removing caveData \uf0c1 Removing caveData (flushing the local cache) should be one of the first troubleshooting steps to take when experiencing weird behavior in CAVE. The cache lives in a folder called caveData , hence why this process is also referred to as removing or deleting caveData. Linux \uf0c1 For Linux users, the easiest way is to open a new terminal and run the following command: rm -rf ~/caveData Windows \uf0c1 For Windows users, simply delete the caveData folder in your home user directory: Mac \uf0c1 For Mac users, the easiest way is to open a new terminal and run the following command: rm -rf ~/Library/caveData Disappearing Configurations \uf0c1 If you ever notice some of the following settings you've configured/saved disappear from CAVE: Saved Displays or Procedures NSHARP settings (line thickness, etc) Colormap settings StyleRule settings This is not a fully exhaustive list, so if something else has disappeared it might be the same underlying issue still. Then it is likely we have recently changed our production EDEX server. There is a good chance we can recover your settings. To do so, please send a short email to support-awips@unidata.ucar.edu with the topic \"Missing Configurations\", and include the username(s) of the computer(s) you use to run CAVE. Remotely Connecting to CAVE \uf0c1 Since the pandemic began, many users have asked if they can use X11 forwarding or ssh tunneling to remotely connect to CAVE machines. This is not recommended or supported , and CAVE crashes in many different ways and expresses strange behavior as well. We highly recommend you download the appropriate CAVE installer on your local machine, if that is an option. If that is not an option, then the only remote access we recommend is using some type of VNC. RealVNC and nomachine are two options that are in use with positive outcomes. 
UltraVNC may be another option, but may have quite a delay. There may also be other free or paid software available that we are not aware of. It is likely that any VNC option you choose will also require some software or configuration to be set on the remote machine, and this will likely require administrative privileges. CAVE Spring Start Up Error \uf0c1 If you encounter the error below, please see one of our solution methods for resolving: CAVE's Spring container did not initialize correctly and CAVE must shut down. We have found the reason for this failure is because the host machine is set to use a language other than English (ie. Spanish, French, etc). To resolve this issue, either: Switch your system to English, when using CAVE or Use our Virtual Machine option . This option allows your actual machine to stay in whichever language you choose, while allowing you to run CAVE in an environment set to English. Although we list this installation under the Windows OS, this can also be done on Linux. The VM option has one notable drawback at the moment -- it cannot render RGB satellite products. Products Not Loading Properly \uf0c1 This problem is most commonly seen with the direct Windows installation. It can also manifest in the Mac installation (and is possible on Linux), and the root of the problem is not having Python installed properly for CAVE to use the packages. If the Windows installation was not completed properly, it is possible to see incorrect behavior when loading certain products. These are derived products which use the local machine to create and render the data. This creation is dependent upon python and its required packages working correctly. The dataset will be available in the menus and product browser, but when loaded, no data is drawn on the editor, but an entry is added to the legend. You may see an error that mentions the python package, jep . Known datasets this can affect (this is not a comprehensive list): Model Winds Metars Winds METAR Station Plot GFS Precip Type To correct this issue on Windows: Uninstall all related software (C++ Build Tools, Miniconda, Python, CAVE, pip, numpy, jep, etc) Redo all necessary installation instructions in steps 1 through 6 To correct this issue on Mac: Install the awips-python.pkg package found on step 1 To correct this issue on Linux: When running which python from a terminal, make sure /awips2/python/ is returned, if not, reset that environment variable, or re-run the awips_install.sh script from our installation instructions Windows \uf0c1 CAVE Map Display in Lower Left Quadrant - Windows \uf0c1 If you start up CAVE in Windows and notice the map is showing up only in the bottom left quadrant of your display, you will just need to tweak a few display settings. Try following these steps to fix your issue: Right-click on the CAVE.exe (or shortcut) icon, select Properties Select the Compatibility tab Click \"Change High DPI Settings\" At the bottom enable \"Override High DPI scaling behavior\" Change the dropdown from Application to System Windows CAVE Start Up Error \uf0c1 This should no longer be an issue for our v20 release of AWIPS. One common error some users are seeing manifests itself just after selecting an EDEX server to connect to. The following error dialogs may show up: Error purging logs Error instantiating workbench: null These errors are actually happening because the Windows machine is using IPv6, which is not compatible with AWIPS at this time. 
To fix the issue simply follow these steps: These screenshots may vary from your system. These instructions are per connection , so if you use multiple connections or switch between wired and wireless connections, you'll need to do the following for each of those connections so that CAVE will always run properly. 1. Close all error windows and any open windows associated with CAVE. 2. In the Windows search field, search for \"control panel\". 3. Once in the Control Panel, look for \"Network and Sharing Center\". 4. Select the adapter for your current connection (should be either \"Ethernet\" or \"Wi-Fi\"). 5. Click on \"Properties\". 6. Uncheck \"Internet Protocol Version 6 (TCP/IPv6)\" and select OK. You may need to restart your machine for this to take effect 7. Restart CAVE. MacOS \uf0c1 Monterey CAVE Warning \uf0c1 If you are running MacOS Monterey, you may see the following message when starting CAVE: Monterey versions 12.3 or newer will not support our production (v18) CAVE. Please download and install our beta v20 CAVE for newer MacOS Versions to avoid this issue. White Boxes for Surface Resources \uf0c1 If you do not have an NVIDIA graphics card and driver, you may see \"boxes\" drawn on the editor for some of the products ( METARS Station Plots and Surface Winds are the resources we're aware of), as shown below: You may be able to fix this issue: Check what graphics cards are available on your machine, by going to the Apple menu (far left, upper corner) > About This Mac > Overview tab (default): If you see two entries at the Graphics line, like the image shown above, then you have two graphics cards on your system. Intel graphics cards may be able to render our products properly. In this case, you can \"force\" your computer to use the Intel card by running the following in a terminal: sudo pmset -[a|b|c] gpuswitch 0 Where [a|b|c] is only one of those options, which mean: a: adjust settings for all scenarios b: adjust settings while running off battery c: adjust settings while connected to charger The argument 0 sets the computer to use the dedicated GPU (in our case above the Intel GPU). The two other options for that argument are: 1: automatic graphics switching 2: integrated GPU It may be smart to run pmset -g first, so you can see what the current gpuswitch setting is (likely 1 ), that way you can revert the settings if you want them back to how they were, when not using CAVE. Linux \uf0c1 Troubleshooting Uninstalling EDEX \uf0c1 Sometimes yum can get in a weird state and not know what AWIPS groups have been installed. 
For example if you are trying to remove AWIPS you may see an error: yum groupremove \"AWIPS EDEX Server\" Loaded plugins: fastestmirror, langpacks Loading mirror speeds from cached hostfile * base: mirror.dal.nexril.net * elrepo: ftp.osuosl.org * epel: mirrors.xmission.com * extras: mirrors.cat.pdx.edu * updates: mirror.mobap.edu No environment named AWIPS EDEX Server exists Maybe run: yum groups mark remove (see man yum) No packages to remove from groups To solve this issue, mark the group you want to remove and then try removing it again: yum groups mark remove \"AWIPS EDEX Server\" yum groupremove \"AWIPS EDEX Server\" Check to make sure your /etc/yum.repos.d/awips2.repo file has enabled=1 .","title":"Common Problems"},{"location":"appendix/common-problems/#common-problems","text":"","title":"Common Problems"},{"location":"appendix/common-problems/#all-operating-systems","text":"","title":"All Operating Systems"},{"location":"appendix/common-problems/#removing-cavedata","text":"Removing caveData (flushing the local cache) should be one of the first troubleshooting steps to take when experiencing weird behavior in CAVE. The cache lives in a folder called caveData , hence why this process is also referred to as removing or deleting caveData.","title":"Removing caveData"},{"location":"appendix/common-problems/#linux","text":"For Linux users, the easiest way is to open a new terminal and run the following command: rm -rf ~/caveData","title":"Linux"},{"location":"appendix/common-problems/#windows","text":"For Windows users, simply delete the caveData folder in your home user directory:","title":"Windows"},{"location":"appendix/common-problems/#mac","text":"For Mac users, the easiest way is to open a new terminal and run the following command: rm -rf ~/Library/caveData","title":"Mac"},{"location":"appendix/common-problems/#disappearing-configurations","text":"If you ever notice some of the following settings you've configured/saved disappear from CAVE: Saved Displays or Procedures NSHARP settings (line thickness, etc) Colormap settings StyleRule settings This is not a fully exhaustive list, so if something else has disappeared it might be the same underlying issue still. Then it is likely we have recently changed our production EDEX server. There is a good chance we can recover your settings. To do so, please send a short email to support-awips@unidata.ucar.edu with the topic \"Missing Configurations\", and include the username(s) of the computer(s) you use to run CAVE.","title":"Disappearing Configurations"},{"location":"appendix/common-problems/#remotely-connecting-to-cave","text":"Since the pandemic began, many users have asked if they can use X11 forwarding or ssh tunneling to remotely connect to CAVE machines. This is not recommended or supported , and CAVE crashes in many different ways and expresses strange behavior as well. We highly recommend you download the appropriate CAVE installer on your local machine, if that is an option. If that is not an option, then the only remote access we recommend is using some type of VNC. RealVNC and nomachine are two options that are in use with positive outcomes. UltraVNC may be another option, but may have quite a delay. There may also be other free or paid software available that we are not aware of. 
It is likely that any VNC option you choose will also require some software or configuration to be set on the remote machine, and this will likely require administrative privileges.","title":"Remotely Connecting to CAVE"},{"location":"appendix/common-problems/#cave-spring-start-up-error","text":"If you encounter the error below, please see one of our solution methods for resolving: CAVE's Spring container did not initialize correctly and CAVE must shut down. We have found the reason for this failure is because the host machine is set to use a language other than English (ie. Spanish, French, etc). To resolve this issue, either: Switch your system to English, when using CAVE or Use our Virtual Machine option . This option allows your actual machine to stay in whichever language you choose, while allowing you to run CAVE in an environment set to English. Although we list this installation under the Windows OS, this can also be done on Linux. The VM option has one notable drawback at the moment -- it cannot render RGB satellite products.","title":"CAVE Spring Start Up Error"},{"location":"appendix/common-problems/#products-not-loading-properly","text":"This problem is most commonly seen with the direct Windows installation. It can also manifest in the Mac installation (and is possible on Linux), and the root of the problem is not having Python installed properly for CAVE to use the packages. If the Windows installation was not completed properly, it is possible to see incorrect behavior when loading certain products. These are derived products which use the local machine to create and render the data. This creation is dependent upon python and its required packages working correctly. The dataset will be available in the menus and product browser, but when loaded, no data is drawn on the editor, but an entry is added to the legend. You may see an error that mentions the python package, jep . Known datasets this can affect (this is not a comprehensive list): Model Winds Metars Winds METAR Station Plot GFS Precip Type To correct this issue on Windows: Uninstall all related software (C++ Build Tools, Miniconda, Python, CAVE, pip, numpy, jep, etc) Redo all necessary installation instructions in steps 1 through 6 To correct this issue on Mac: Install the awips-python.pkg package found on step 1 To correct this issue on Linux: When running which python from a terminal, make sure /awips2/python/ is returned, if not, reset that environment variable, or re-run the awips_install.sh script from our installation instructions","title":"Products Not Loading Properly"},{"location":"appendix/common-problems/#windows_1","text":"","title":"Windows"},{"location":"appendix/common-problems/#cave-map-display-in-lower-left-quadrant-windows","text":"If you start up CAVE in Windows and notice the map is showing up only in the bottom left quadrant of your display, you will just need to tweak a few display settings. Try following these steps to fix your issue: Right-click on the CAVE.exe (or shortcut) icon, select Properties Select the Compatibility tab Click \"Change High DPI Settings\" At the bottom enable \"Override High DPI scaling behavior\" Change the dropdown from Application to System","title":"CAVE Map Display in Lower Left Quadrant - Windows"},{"location":"appendix/common-problems/#windows-cave-start-up-error","text":"This should no longer be an issue for our v20 release of AWIPS. One common error some users are seeing manifests itself just after selecting an EDEX server to connect to. 
The following error dialogs may show up: Error purging logs Error instantiating workbench: null These errors are actually happening because the Windows machine is using IPv6, which is not compatible with AWIPS at this time. To fix the issue simply follow these steps: These screenshots may vary from your system. These instructions are per connection , so if you use multiple connections or switch between wired and wireless connections, you'll need to do the following for each of those connections so that CAVE will always run properly. 1. Close all error windows and any open windows associated with CAVE. 2. In the Windows search field, search for \"control panel\". 3. Once in the Control Panel, look for \"Network and Sharing Center\". 4. Select the adapter for your current connection (should be either \"Ethernet\" or \"Wi-Fi\"). 5. Click on \"Properties\". 6. Uncheck \"Internet Protocol Version 6 (TCP/IPv6)\" and select OK. You may need to restart your machine for this to take effect 7. Restart CAVE.","title":"Windows CAVE Start Up Error"},{"location":"appendix/common-problems/#macos","text":"","title":"MacOS"},{"location":"appendix/common-problems/#monterey-cave-warning","text":"If you are running MacOS Monterey, you may see the following message when starting CAVE: Monterey versions 12.3 or newer will not support our production (v18) CAVE. Please download and install our beta v20 CAVE for newer MacOS Versions to avoid this issue.","title":"Monterey CAVE Warning"},{"location":"appendix/common-problems/#white-boxes-for-surface-resources","text":"If you do not have an NVIDIA graphics card and driver, you may see \"boxes\" drawn on the editor for some of the products ( METARS Station Plots and Surface Winds are the resources we're aware of), as shown below: You may be able to fix this issue: Check what graphics cards are available on your machine, by going to the Apple menu (far left, upper corner) > About This Mac > Overview tab (default): If you see two entries at the Graphics line, like the image shown above, then you have two graphics cards on your system. Intel graphics cards may be able to render our products properly. In this case, you can \"force\" your computer to use the Intel card by running the following in a terminal: sudo pmset -[a|b|c] gpuswitch 0 Where [a|b|c] is only one of those options, which mean: a: adjust settings for all scenarios b: adjust settings while running off battery c: adjust settings while connected to charger The argument 0 sets the computer to use the dedicated GPU (in our case above the Intel GPU). The two other options for that argument are: 1: automatic graphics switching 2: integrated GPU It may be smart to run pmset -g first, so you can see what the current gpuswitch setting is (likely 1 ), that way you can revert the settings if you want them back to how they were, when not using CAVE.","title":"White Boxes for Surface Resources"},{"location":"appendix/common-problems/#linux_1","text":"","title":"Linux"},{"location":"appendix/common-problems/#troubleshooting-uninstalling-edex","text":"Sometimes yum can get in a weird state and not know what AWIPS groups have been installed. 
For example if you are trying to remove AWIPS you may see an error: yum groupremove \"AWIPS EDEX Server\" Loaded plugins: fastestmirror, langpacks Loading mirror speeds from cached hostfile * base: mirror.dal.nexril.net * elrepo: ftp.osuosl.org * epel: mirrors.xmission.com * extras: mirrors.cat.pdx.edu * updates: mirror.mobap.edu No environment named AWIPS EDEX Server exists Maybe run: yum groups mark remove (see man yum) No packages to remove from groups To solve this issue, mark the group you want to remove and then try removing it again: yum groups mark remove \"AWIPS EDEX Server\" yum groupremove \"AWIPS EDEX Server\" Check to make sure your /etc/yum.repos.d/awips2.repo file has enabled=1 .","title":"Troubleshooting Uninstalling EDEX"},{"location":"appendix/educational-resources/","text":"Educational Resources \uf0c1 Here at Unidata, we want to provide as many resources as possible to make our tools and applications easy to use. For AWIPS we currently have a new eLearning course that is specific to CAVE. We also have a suite of Jupyter Notebooks that are meant to provide a detailed overview of many capabilities of python-awips. CAVE eLearning Course \uf0c1 Learn AWIPS CAVE is our online educational course for those interested in learning about CAVE. Access \uf0c1 Please create an account on Unidata eLearning , then self-enroll in Learn AWIPS CAVE . Content \uf0c1 Learn AWIPS CAVE is specifically tailored to content regarding CAVE -- the local graphical application used to view weather data. The following topics and capabilities are covered throughout the course: Launching CAVE Navigating the interface Modifying product appearances Understanding the time match basis Creating publication-quality graphics Exploring various CAVE layouts Saving and loading procedures and displays Using radar displays Using baselines and points Creating time series displays Creating vertical cross section displays Using the NSHARP editor for soundings Viewing model soundings Prerequisites \uf0c1 Required: A supported web browser CAVE version 18.2.1 installed on a supported operating system Recommended: A keyboard with a numpad and mouse with a scrollwheel Second monitor Design \uf0c1 Learn AWIPS CAVE is designed for those new to AWIPS or for those seeking to learn best practices. The course is organized into modular sections with supporting lessons, allowing for spaced learning or completion in multiple class or lab sessions. Each section concludes with a quiz to assess learning, and results can be requested by instructors or supervisors for their classes/teams. Below is a snapshot taken from the course. Lessons are tied to relevant learning objectives . Lessons are scaffolded such that each skill builds upon the next. Tutorials, challenges, and assessments are designed to support higher-order thinking skills and learning retention. Support \uf0c1 If you experience any technical issues with our online course, please contact us at: support-elearning@unidata.ucar.edu Python-AWIPS eLearning Course \uf0c1 Learn Python-AWIPS is our online educational course for those interested in learning about Python-AWIPS . Access \uf0c1 Please create an account on Unidata eLearning , then self-enroll in Learn Python-AWIPS . Content \uf0c1 Learn Python-AWIPS is designed for new users of Python-AWIPS who have some background in both Python and CAVE. Through tutorials, challenges, and demonstrations, you will learn the basics for working with EDEX resources through Python. 
The following topics and capabilities are covered throughout the course: Programmatically explore the resources available on an EDEX server Make a request to an EDEX for data See examples of data manipulation Plot requested data Prerequisites \uf0c1 Required: A supported web browser Python3 Conda Git Python-AWIPS using the Source Code with Examples Install instructions Support \uf0c1 If you experience any technical issues with our online course, please contact us at: support-elearning@unidata.ucar.edu Python-AWIPS Example Notebooks \uf0c1 In addition to CAVE, AWIPS also has a Python package called python-awips which allows access to all data on an EDEX server. We have created a suite of Jupyter Notebooks as examples for how to use various functions of python-awips. Access \uf0c1 All of our Notebooks can be downloaded and accessed locally by following the source code installation instructions found on our python-awips website . Additionally, non-interactive webpage renderings of each of the Notebooks are also available for quick and easy references. Content \uf0c1 Our python-awips Notebooks span a wide range of topics, but generally cover the following: Investigating what data is available on an EDEX server Accessing and filtering desired data based on time and location Plotting and analyzing datasets Specific examples for various data types: satellite imagery, model data, soundings, surface obs, and more YouTube Channel and Playlist \uf0c1 Unidata has a YouTube channel where we publish videos about all of our software packages. Specifically, we also have a playlist dedicated to AWIPS videos. Access \uf0c1 All Unidata videos can be accessed here on our channel. All AWIPS videos can be found on the AWIPS Playlist . Content \uf0c1 Our AWIPS videos cover a wide range of topics, but include some of the following themes: AWIPS topic overviews Instructional videos (ex. how to install CAVE) In-depth walkthroughs on CAVE functionality Python-AWIPS notebook examples AWIPS Tips Blog Series \uf0c1 AWIPS Tips is a bi-weekly (every two weeks) blog series that is posted on our Unidata blogs page. Entries in the series cover topics relating to CAVE, python-awips, EDEX, and more. Access \uf0c1 View all of the AWIPS Tips blogs here , and easily search for them using the awips-tips tag. Please join our mailing list (awips2-users) to get notifications of new AWIPS Tips when they come out! Content \uf0c1 A full list of all released blogs can be found below: General \uf0c1 Welcome to AWIPS Tips! AWIPS 18.2.1 Software Release Announcing AWIPS eLearning AWIPS 18.2.1-3 Software Release Access Learn AWIPS CAVE from Unidata eLearning AWIPS 18.2.1-5 Software Release GLM DATA IDD/LDM Feed Updates AWIPS 18.2.1-6 Software Release Unidata AWIPS Summer Internship 2022: Rhoen Fiutak Announcing a New eLearning Course: Learn Python-AWIPS Use Case Example: Texas A&M CAVE in the Classroom AWIPS 20.3.2-0.1 Beta CAVE Software Release AWIPS 20.3.2-0.2 Beta CAVE Software Release AWIPS 20.3.2-0.3 Beta CAVE Software Release AWIPS 20.3.2-0.4 Beta Software Release - with EDEX!
CAVE \uf0c1 Visualizing Data in CAVE Display Capabilities in CAVE Time Tips Explore the CAVE Product Browser CAVE's Local Cache: caveData Explore the CAVE Volume Browser: Plan Views Using CAVE's Points and Baselines Tool Explore the CAVE Volume Browser: Cross Section and Time Series Using CAVE Displays and Procedures Getting Started With the NSHARP Display Tool Explore the CAVE Volume Browser: Model Soundings NUCAPS Soundings Import Shapefiles in CAVE Create Objective Analysis Plots Use Warngen to Draw Convective Warnings Using Drawing Properties for WWA Display in CAVE Understanding Graphic vs Image Products in CAVE Getting to Know CAVE's Display Properties Creating a User Override Frames in CAVE Panes in CAVE Image Combination with CAVE Colorized GOES CIRA Products Changing Localizations in CAVE All About Sampling Maps Database Constraints Python-AWIPS \uf0c1 Access Model Output with Python-AWIPS Plot New GOES Products From Unidata's Public EDEX Load Map Resources and Topography using Python-AWIPS Create a Colored Surface Temperature Plot Create Colorized Model Plots View WWA Polygons with Python-AWIPS Creating METAR Station Plots Create Sounding Plots with Model Data Plotting Multiple Datasets from EDEX Open Jupyter Notebooks with our Virtual Machine Visualizing Upper Air Soundings Compare Model Sounding Data in Python Beta Python-AWIPS Release EDEX \uf0c1 Get to Know EDEX EDEX Data Retention Adding ECMWF Data to EDEX Ingesting GOES Satellite Data Localization Levels in EDEX Porting Users CAVE Configurations Creating New Scales/Maps Adding Shapefiles to the Maps Menu with EDEX Removing Model Data from EDEX LDM Usage in AWIPS","title":"Educational Resources"},{"location":"appendix/educational-resources/#educational-resources","text":"Here at Unidata, we want to provide as many resources as possible to make our tools and applications easy to use. For AWIPS we currently have a new eLearning course that is specific to CAVE. We also have a suite of Jupyter Notebooks that are meant to provide a detailed overview of many capabilities of python-awips.","title":"Educational Resources"},{"location":"appendix/educational-resources/#cave-elearning-course","text":"Learn AWIPS CAVE is our online educational course for those interested in learning about CAVE.","title":"CAVE eLearning Course"},{"location":"appendix/educational-resources/#access","text":"Please create an account on Unidata eLearning , then self-enroll in Learn AWIPS CAVE .","title":"Access"},{"location":"appendix/educational-resources/#content","text":"Learn AWIPS CAVE is specifically tailored to content regarding CAVE -- the local graphical application used to view weather data. 
The following topics and capabilities are covered throughout the course: Launching CAVE Navigating the interface Modifying product appearances Understanding the time match basis Creating publication-quality graphics Exploring various CAVE layouts Saving and loading procedures and displays Using radar displays Using baselines and points Creating time series displays Creating vertical cross section displays Using the NSHARP editor for soundings Viewing model soundings","title":"Content"},{"location":"appendix/educational-resources/#prerequisites","text":"Required: A supported web browser CAVE version 18.2.1 installed on a supported operating system Recommended: A keyboard with a numpad and mouse with a scrollwheel Second monitor","title":"Prerequisites"},{"location":"appendix/educational-resources/#design","text":"Learn AWIPS CAVE is designed for those new to AWIPS or for those seeking to learn best practices. The course is organized into modular sections with supporting lessons, allowing for spaced learning or completion in multiple class or lab sessions. Each section concludes with a quiz to assess learning, and results can be requested by instructors or supervisors for their classes/teams. Below is a snapshot taken from the course. Lessons are tied to relevant learning objectives . Lessons are scaffolded such that each skill builds upon the next. Tutorials, challenges, and assessments are designed to support higher-order thinking skills and learning retention.","title":"Design"},{"location":"appendix/educational-resources/#support","text":"If you experience any technical issues with our online course, please contact us at: support-elearning@unidata.ucar.edu","title":"Support"},{"location":"appendix/educational-resources/#python-awips-elearning-course","text":"Learn Python-AWIPS is our online educational course for those interested in learning about Python-AWIPS .","title":"Python-AWIPS eLearning Course"},{"location":"appendix/educational-resources/#access_1","text":"Please create an account on Unidata eLearning , then self-enroll in Learn Python-AWIPS .","title":"Access"},{"location":"appendix/educational-resources/#content_1","text":"Learn Python-AWIPS is designed for new users of Python-AWIPS who have some background in both Python and CAVE. Through tutorials, challenges, and demonstrations, you will learn the basics for working with EDEX resources through Python. The following topics and capabilities are covered throughout the course: Programmatically explore the resources available on an EDEX server Make a request to an EDEX for data See examples of data manipulation Plot requested data","title":"Content"},{"location":"appendix/educational-resources/#prerequisites_1","text":"Required: A supported web browser Python3 Conda Git Python-AWIPS using the Source Code with Examples Install instructions","title":"Prerequisites"},{"location":"appendix/educational-resources/#support_1","text":"If you experience any technical issues with our online course, please contact us at: support-elearning@unidata.ucar.edu","title":"Support"},{"location":"appendix/educational-resources/#python-awips-example-notebooks","text":"In addition to CAVE, AWIPS also has a Python package called python-awips which allows access to all data on an EDEX server. 
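As a quick, minimal sketch (assuming python-awips is installed and that the public edex-cloud.unidata.ucar.edu server is reachable; substitute your own EDEX hostname if you run one), the package can be used to list the data types an EDEX server offers:
from awips.dataaccess import DataAccessLayer
# Point the Data Access Framework at an EDEX server
DataAccessLayer.changeEDEXHost('edex-cloud.unidata.ucar.edu')
# Print the data types (grid, satellite, radar, obs, etc.) the server supports
print(DataAccessLayer.getSupportedDatatypes())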
We have created a suite of Jupyter Notebooks as examples for how to use various functions of python-awips.","title":"Python-AWIPS Example Notebooks"},{"location":"appendix/educational-resources/#access_2","text":"All of our Notebooks can be downloaded and accessed locally by following the source code installation instructions found on our python-awips website . Additionally, non-interactive webpage renderings of each of the Notebooks are also available for quick and easy references.","title":"Access"},{"location":"appendix/educational-resources/#content_2","text":"Our python-awips Notebooks span a wide range of topics, but generally cover the following: Investigating what data is available on an EDEX server Accessing and filtering desired data based on time and location Plotting and analyzing datasets Specific examples for various data types: satellite imagery, model data, soundings, surface obs, and more","title":"Content"},{"location":"appendix/educational-resources/#youtube-channel-and-playlist","text":"Unidata has a YouTube channel where we publish videos about all of our software packages. Specifically, we also have a playlist dedicated to AWIPS videos.","title":"YouTube Channel and Playlist"},{"location":"appendix/educational-resources/#access_3","text":"All Unidata videos can be accessed here on our channel. All AWIPS videos can be found on the AWIPS Playlist .","title":"Access"},{"location":"appendix/educational-resources/#content_3","text":"Our AWIPS videos cover a wide range of topics, but include some of the following themes: AWIPS topic overviews Instructional videos (ex. how to install CAVE) In-depth walkthroughs on CAVE functionality Python-AWIPS notebook examples","title":"Content"},{"location":"appendix/educational-resources/#awips-tips-blog-series","text":"AWIPS Tips is a bi-weekly (every two weeks) blog series that is posted on our Unidata blogs page. Entries in the series cover topics relating to CAVE, python-awips, EDEX, and more.","title":"AWIPS Tips Blog Series"},{"location":"appendix/educational-resources/#access_4","text":"View all of the AWIPS Tips blogs here , and easily search for them using the awips-tips tag. Please join our mailing list (awips2-users) to get notifications of new AWIPS Tips when they come out!","title":"Access"},{"location":"appendix/educational-resources/#content_4","text":"A full list of all released blogs can be found below:","title":"Content"},{"location":"appendix/educational-resources/#general","text":"Welcome to AWIPS Tips!
AWIPS 18.2.1 Software Release Announcing AWIPS eLearning AWIPS 18.2.1-3 Software Release Access Learn AWIPS CAVE from Unidata eLearning AWIPS 18.2.1-5 Software Release GLM DATA IDD/LDM Feed Updates AWIPS 18.2.1-6 Software Release Unidata AWIPS Summer Internship 2022: Rhoen Fiutak Announcing a New eLearning Course: Learn Python-AWIPS Use Case Example: Texas A&M CAVE in the Classroom AWIPS 20.3.2-0.1 Beta CAVE Software Release AWIPS 20.3.2-0.2 Beta CAVE Software Release AWIPS 20.3.2-0.3 Beta CAVE Software Release AWIPS 20.3.2-0.4 Beta Software Release - with EDEX!","title":"General"},{"location":"appendix/educational-resources/#cave","text":"Visualizing Data in CAVE Display Capabilities in CAVE Time Tips Explore the CAVE Product Browser CAVE's Local Cache: caveData Explore the CAVE Volume Browser: Plan Views Using CAVE's Points and Baselines Tool Explore the CAVE Volume Browser: Cross Section and Time Series Using CAVE Displays and Procedures Getting Started With the NSHARP Display Tool Explore the CAVE Volume Browser: Model Soundings NUCAPS Soundings Import Shapefiles in CAVE Create Objective Analysis Plots Use Warngen to Draw Convective Warnings Using Drawing Properties for WWA Display in CAVE Understanding Graphic vs Image Products in CAVE Getting to Know CAVE's Display Properties Creating a User Override Frames in CAVE Panes in CAVE Image Combination with CAVE Colorized GOES CIRA Products Changing Localizations in CAVE All About Sampling Maps Database Constraints","title":"CAVE"},{"location":"appendix/educational-resources/#python-awips","text":"Access Model Output with Python-AWIPS Plot New GOES Products From Unidata's Public EDEX Load Map Resources and Topography using Python-AWIPS Create a Colored Surface Temperature Plot Create Colorized Model Plots View WWA Polygons with Python-AWIPS Creating METAR Station Plots Create Sounding Plots with Model Data Plotting Multiple Datasets from EDEX Open Jupyter Notebooks with our Virtual Machine Visualizing Upper Air Soundings Compare Model Sounding Data in Python Beta Python-AWIPS Release","title":"Python-AWIPS"},{"location":"appendix/educational-resources/#edex","text":"Get to Know EDEX EDEX Data Retention Adding ECMWF Data to EDEX Ingesting GOES Satellite Data Localization Levels in EDEX Porting Users CAVE Configurations Creating New Scales/Maps Adding Shapefiles to the Maps Menu with EDEX Removing Model Data from EDEX LDM Usage in AWIPS","title":"EDEX"},{"location":"appendix/maps-database/","text":"mapdata.airport \uf0c1 Column Type arpt_id character varying(4) name character varying(42) city character varying(40) state character varying(2) siteno character varying(9) site_type character varying(1) fac_use character varying(2) owner_type character varying(2) elv integer latitude character varying(16) longitude character varying(16) lon double precision lat double precision the_geom geometry(Point,4326) ok mapdata.allrivers \uf0c1 Column Type ihabbsrf_i double precision rr character varying(11) huc integer type character varying(1) pmile double precision pname character varying(30) owname character varying(30) pnmcd character varying(11) ownmcd character varying(11) dsrr double precision dshuc integer usdir character varying(1) lev smallint j smallint termid integer trmblv smallint k smallint the_geom geometry(MultiLineString,4326) mapdata.artcc \uf0c1 Column Type artcc character varying(4) alt character varying(1) name character varying(30) type character varying(5) city character varying(40) id double precision the_geom 
geometry(MultiPolygon,4326) mapdata.basins \uf0c1 Column Type rfc character varying(7) cwa character varying(5) id character varying(8) name character varying(64) lon double precision lat double precision the_geom geometry(MultiPolygon,4326) the_geom_0 geometry(MultiPolygon,4326) the_geom_0_064 geometry(MultiPolygon,4326) the_geom_0_016 geometry(MultiPolygon,4326) the_geom_0_004 geometry(MultiPolygon,4326) the_geom_0_001 geometry(MultiPolygon,4326) mapdata.canada \uf0c1 Column Type f_code character varying(5) name_en character varying(25) nom_fr character varying(25) country character varying(3) cgns_fid character varying(32) the_geom geometry(MultiPolygon,4326) mapdata.city \uf0c1 Column Type st_fips character varying(4) sfips character varying(2) county_fip character varying(4) cfips character varying(4) pl_fips character varying(7) id character varying(20) name character varying(39) elevation character varying(60) pop_1990 numeric population character varying(30) st character varying(6) warngenlev character varying(16) warngentyp character varying(16) watch_warn character varying(3) zwatch_war double precision prog_disc integer zprog_disc double precision comboflag double precision land_water character varying(16) recnum double precision lon double precision lat double precision f3 double precision f4 character varying(254) f6 double precision state character varying(25) the_geom geometry(Point,4326) mapdata.county \uf0c1 Column Type state character varying(2) cwa character varying(9) countyname character varying(24) fips character varying(5) time_zone character varying(2) fe_area character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326) mapdata.customlocations \uf0c1 Column Type bullet character varying(16) name character varying(64) cwa character varying(12) rfc character varying(8) lon numeric lat numeric the_geom geometry(MultiPolygon,4326) mapdata.cwa \uf0c1 Column Type cwa character varying(9) wfo character varying(3) lon numeric lat numeric region character varying(2) fullstaid character varying(4) citystate character varying(50) city character varying(50) state character varying(50) st character varying(2) the_geom geometry(MultiPolygon,4326) mapdata.firewxaor \uf0c1 Column Type cwa character varying(3) wfo character varying(3) the_geom geometry(MultiPolygon,4326) mapdata.firewxzones \uf0c1 Column Type state character varying(2) zone character varying(3) cwa character varying(3) name character varying(254) state_zone character varying(5) time_zone character varying(2) fe_area character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326) mapdata.fix \uf0c1 Column Type id character varying(30) type character varying(2) use character varying(5) state character varying(2) min_alt integer latitude character varying(16) longitude character varying(16) lon double precision lat double precision the_geom geometry(Point,4326) mapdata.highaltitude \uf0c1 Column Type awy_des character varying(2) awy_id character varying(12) awy_type character varying(1) airway character varying(16) newfield1 double precision the_geom geometry(MultiLineString,4326) mapdata.highsea \uf0c1 Column Type wfo character varying(3) name character varying(250) lat numeric lon numeric id character varying(5) the_geom geometry(MultiPolygon,4326) mapdata.highway \uf0c1 Column Type prefix character varying(2) pretype character varying(6) name character varying(30) type character varying(6) suffix character varying(2) class character varying(1) class_rte character varying(1) 
hwy_type character varying(1) hwy_symbol character varying(20) route character varying(25) the_geom geometry(MultiLineString,4326) mapdata.hsa \uf0c1 Column Type wfo character varying(3) lon double precision lat double precision the_geom geometry(MultiPolygon,4326) mapdata.interstate \uf0c1 Column Type prefix character varying(2) pretype Ushy Hwy Ave Cord Rt Loop I Sthy name character varying(30) type character varying(6) suffix character varying(2) hwy_type I U S hwy_symbol character varying(20) route character varying(25) the_geom geometry(MultiLineString,4326) mapdata.isc \uf0c1 Column Type wfo character varying(3) cwa character varying(3) the_geom geometry(MultiPolygon,4326) mapdata.lake \uf0c1 Column Type name character varying(40) feature character varying(40) lon double precision lat double precision the_geom geometry(MultiPolygon,4326) mapdata.latlon10 \uf0c1 Column Type the_geom geometry(MultiLineString,4326) mapdata.lowaltitude \uf0c1 Column Type awy_des character varying(2) awy_id character varying(12) awy_type character varying(1) airway character varying(16) newfield1 double precision the_geom geometry(MultiLineString,4326) mapdata.majorrivers \uf0c1 Column Type rf1_150_id double precision huc integer seg smallint milept double precision seqno double precision rflag character varying(1) owflag character varying(1) tflag character varying(1) sflag character varying(1) type character varying(1) segl double precision lev smallint j smallint k smallint pmile double precision arbsum double precision usdir character varying(1) termid integer trmblv smallint pname character varying(30) pnmcd character varying(11) owname character varying(30) ownmcd character varying(11) dshuc integer dsseg smallint dsmlpt double precision editrf1_ double precision demand double precision ftimped double precision tfimped double precision dir double precision rescode double precision center double precision erf1__ double precision reservoir_ double precision pname_res character varying(30) pnmcd_res character varying(11) meanq double precision lowq double precision meanv double precision lowv double precision worka double precision gagecode double precision strahler double precision rr character varying(11) dsrr double precision huc2 smallint huc4 smallint huc6 integer the_geom geometry(MultiLineString,4326) mapdata.marinesites \uf0c1 Column Type st character varying(3) name character varying(50) prog_disc bigint warngenlev character varying(14) the_geom geometry(Point,4326) mapdata.marinezones \uf0c1 Column Type id character varying(6) wfo character varying(3) gl_wfo character varying(3) name character varying(254) ajoin0 character varying(6) ajoin1 character varying(6) lon numeric lat numeric the_geom geometry(MultiPolygon,4326) mapdata.mexico \uf0c1 Column Type area double precision perimeter double precision st_mx_ double precision st_mx_id double precision name character varying(66) country character varying(127) continent character varying(127) the_geom geometry(MultiPolygon,4326) mapdata.navaid \uf0c1 Column Type id character varying(30) clscode character varying(11) city character varying(40) elv integer freq double precision name character varying(30) status character varying(30) type character varying(25) oprhours character varying(11) oprname character varying(50) latdms character varying(16) londms character varying(16) airway character varying(254) sym smallint lon double precision lat double precision the_geom geometry(Point,4326) mapdata.offshore \uf0c1 Column Type id character 
varying(50) wfo character varying(10) lon numeric lat numeric location character varying(70) name character varying(90) the_geom geometry(MultiPolygon,4326) mapdata.railroad \uf0c1 Column Type fnode_ double precision tnode_ double precision lpoly_ double precision rpoly_ double precision length numeric railrdl021 double precision railrdl020 double precision feature character varying(18) name character varying(43) state character varying(2) state_fips character varying(2) the_geom geometry(MultiLineString,4326) mapdata.rfc \uf0c1 Column Type site_id character varying(3) state character varying(2) rfc_name character varying(18) rfc_city character varying(25) basin_id character varying(5) the_geom geometry(MultiPolygon,4326) mapdata.specialuse \uf0c1 Column Type name character varying(32) code character varying(16) yn smallint alt_desc character varying(128) artcc character varying(4) ctr_agen character varying(128) sch_agen character varying(128) state character varying(2) the_geom geometry(MultiPolygon,4326) mapdata.states \uf0c1 Column Type state character varying(2) name character varying(24) fips character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326) mapdata.timezones \uf0c1 Column Type name character varying(50) time_zone character varying(1) standard character varying(9) advanced character varying(10) unix_time character varying(19) lon double precision lat double precision the_geom geometry(MultiPolygon,4326) mapdata.warngenloc \uf0c1 Column Type name character varying(254) st character varying(3) state character varying(20) population integer warngenlev integer cwa character varying(4) goodness double precision lat numeric lon numeric usedirs numeric(10,0) supdirs character varying(20) landwater character varying(3) recnum integer the_geom geometry(MultiPolygon,4326) mapdata.world \uf0c1 Column Type name character varying(30) count double precision first_coun character varying(2) first_regi character varying(1) the_geom geometry(MultiPolygon,4326) mapdata.zone \uf0c1 Column Type state character varying(2) cwa character varying(9) time_zone character varying(2) fe_area character varying(2) zone character varying(3) name character varying(254) state_zone character varying(5) lon numeric lat numeric shortname character varying(32) the_geom geometry(MultiPolygon,4326)","title":"Maps database"},{"location":"appendix/maps-database/#mapdataairport","text":"Column Type arpt_id character varying(4) name character varying(42) city character varying(40) state character varying(2) siteno character varying(9) site_type character varying(1) fac_use character varying(2) owner_type character varying(2) elv integer latitude character varying(16) longitude character varying(16) lon double precision lat double precision the_geom geometry(Point,4326) ok","title":"mapdata.airport"},{"location":"appendix/maps-database/#mapdataallrivers","text":"Column Type ihabbsrf_i double precision rr character varying(11) huc integer type character varying(1) pmile double precision pname character varying(30) owname character varying(30) pnmcd character varying(11) ownmcd character varying(11) dsrr double precision dshuc integer usdir character varying(1) lev smallint j smallint termid integer trmblv smallint k smallint the_geom geometry(MultiLineString,4326)","title":"mapdata.allrivers"},{"location":"appendix/maps-database/#mapdataartcc","text":"Column Type artcc character varying(4) alt character varying(1) name character varying(30) type character varying(5) city character varying(40) id 
double precision the_geom geometry(MultiPolygon,4326)","title":"mapdata.artcc"},{"location":"appendix/maps-database/#mapdatabasins","text":"Column Type rfc character varying(7) cwa character varying(5) id character varying(8) name character varying(64) lon double precision lat double precision the_geom geometry(MultiPolygon,4326) the_geom_0 geometry(MultiPolygon,4326) the_geom_0_064 geometry(MultiPolygon,4326) the_geom_0_016 geometry(MultiPolygon,4326) the_geom_0_004 geometry(MultiPolygon,4326) the_geom_0_001 geometry(MultiPolygon,4326)","title":"mapdata.basins"},{"location":"appendix/maps-database/#mapdatacanada","text":"Column Type f_code character varying(5) name_en character varying(25) nom_fr character varying(25) country character varying(3) cgns_fid character varying(32) the_geom geometry(MultiPolygon,4326)","title":"mapdata.canada"},{"location":"appendix/maps-database/#mapdatacity","text":"Column Type st_fips character varying(4) sfips character varying(2) county_fip character varying(4) cfips character varying(4) pl_fips character varying(7) id character varying(20) name character varying(39) elevation character varying(60) pop_1990 numeric population character varying(30) st character varying(6) warngenlev character varying(16) warngentyp character varying(16) watch_warn character varying(3) zwatch_war double precision prog_disc integer zprog_disc double precision comboflag double precision land_water character varying(16) recnum double precision lon double precision lat double precision f3 double precision f4 character varying(254) f6 double precision state character varying(25) the_geom geometry(Point,4326)","title":"mapdata.city"},{"location":"appendix/maps-database/#mapdatacounty","text":"Column Type state character varying(2) cwa character varying(9) countyname character varying(24) fips character varying(5) time_zone character varying(2) fe_area character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)","title":"mapdata.county"},{"location":"appendix/maps-database/#mapdatacustomlocations","text":"Column Type bullet character varying(16) name character varying(64) cwa character varying(12) rfc character varying(8) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)","title":"mapdata.customlocations"},{"location":"appendix/maps-database/#mapdatacwa","text":"Column Type cwa character varying(9) wfo character varying(3) lon numeric lat numeric region character varying(2) fullstaid character varying(4) citystate character varying(50) city character varying(50) state character varying(50) st character varying(2) the_geom geometry(MultiPolygon,4326)","title":"mapdata.cwa"},{"location":"appendix/maps-database/#mapdatafirewxaor","text":"Column Type cwa character varying(3) wfo character varying(3) the_geom geometry(MultiPolygon,4326)","title":"mapdata.firewxaor"},{"location":"appendix/maps-database/#mapdatafirewxzones","text":"Column Type state character varying(2) zone character varying(3) cwa character varying(3) name character varying(254) state_zone character varying(5) time_zone character varying(2) fe_area character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)","title":"mapdata.firewxzones"},{"location":"appendix/maps-database/#mapdatafix","text":"Column Type id character varying(30) type character varying(2) use character varying(5) state character varying(2) min_alt integer latitude character varying(16) longitude character varying(16) lon double precision lat double precision the_geom 
geometry(Point,4326)","title":"mapdata.fix"},{"location":"appendix/maps-database/#mapdatahighaltitude","text":"Column Type awy_des character varying(2) awy_id character varying(12) awy_type character varying(1) airway character varying(16) newfield1 double precision the_geom geometry(MultiLineString,4326)","title":"mapdata.highaltitude"},{"location":"appendix/maps-database/#mapdatahighsea","text":"Column Type wfo character varying(3) name character varying(250) lat numeric lon numeric id character varying(5) the_geom geometry(MultiPolygon,4326)","title":"mapdata.highsea"},{"location":"appendix/maps-database/#mapdatahighway","text":"Column Type prefix character varying(2) pretype character varying(6) name character varying(30) type character varying(6) suffix character varying(2) class character varying(1) class_rte character varying(1) hwy_type character varying(1) hwy_symbol character varying(20) route character varying(25) the_geom geometry(MultiLineString,4326)","title":"mapdata.highway"},{"location":"appendix/maps-database/#mapdatahsa","text":"Column Type wfo character varying(3) lon double precision lat double precision the_geom geometry(MultiPolygon,4326)","title":"mapdata.hsa"},{"location":"appendix/maps-database/#mapdatainterstate","text":"Column Type prefix character varying(2) pretype Ushy Hwy Ave Cord Rt Loop I Sthy name character varying(30) type character varying(6) suffix character varying(2) hwy_type I U S hwy_symbol character varying(20) route character varying(25) the_geom geometry(MultiLineString,4326)","title":"mapdata.interstate"},{"location":"appendix/maps-database/#mapdataisc","text":"Column Type wfo character varying(3) cwa character varying(3) the_geom geometry(MultiPolygon,4326)","title":"mapdata.isc"},{"location":"appendix/maps-database/#mapdatalake","text":"Column Type name character varying(40) feature character varying(40) lon double precision lat double precision the_geom geometry(MultiPolygon,4326)","title":"mapdata.lake"},{"location":"appendix/maps-database/#mapdatalatlon10","text":"Column Type the_geom geometry(MultiLineString,4326)","title":"mapdata.latlon10"},{"location":"appendix/maps-database/#mapdatalowaltitude","text":"Column Type awy_des character varying(2) awy_id character varying(12) awy_type character varying(1) airway character varying(16) newfield1 double precision the_geom geometry(MultiLineString,4326)","title":"mapdata.lowaltitude"},{"location":"appendix/maps-database/#mapdatamajorrivers","text":"Column Type rf1_150_id double precision huc integer seg smallint milept double precision seqno double precision rflag character varying(1) owflag character varying(1) tflag character varying(1) sflag character varying(1) type character varying(1) segl double precision lev smallint j smallint k smallint pmile double precision arbsum double precision usdir character varying(1) termid integer trmblv smallint pname character varying(30) pnmcd character varying(11) owname character varying(30) ownmcd character varying(11) dshuc integer dsseg smallint dsmlpt double precision editrf1_ double precision demand double precision ftimped double precision tfimped double precision dir double precision rescode double precision center double precision erf1__ double precision reservoir_ double precision pname_res character varying(30) pnmcd_res character varying(11) meanq double precision lowq double precision meanv double precision lowv double precision worka double precision gagecode double precision strahler double precision rr character varying(11) dsrr double 
precision huc2 smallint huc4 smallint huc6 integer the_geom geometry(MultiLineString,4326)","title":"mapdata.majorrivers"},{"location":"appendix/maps-database/#mapdatamarinesites","text":"Column Type st character varying(3) name character varying(50) prog_disc bigint warngenlev character varying(14) the_geom geometry(Point,4326)","title":"mapdata.marinesites"},{"location":"appendix/maps-database/#mapdatamarinezones","text":"Column Type id character varying(6) wfo character varying(3) gl_wfo character varying(3) name character varying(254) ajoin0 character varying(6) ajoin1 character varying(6) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)","title":"mapdata.marinezones"},{"location":"appendix/maps-database/#mapdatamexico","text":"Column Type area double precision perimeter double precision st_mx_ double precision st_mx_id double precision name character varying(66) country character varying(127) continent character varying(127) the_geom geometry(MultiPolygon,4326)","title":"mapdata.mexico"},{"location":"appendix/maps-database/#mapdatanavaid","text":"Column Type id character varying(30) clscode character varying(11) city character varying(40) elv integer freq double precision name character varying(30) status character varying(30) type character varying(25) oprhours character varying(11) oprname character varying(50) latdms character varying(16) londms character varying(16) airway character varying(254) sym smallint lon double precision lat double precision the_geom geometry(Point,4326)","title":"mapdata.navaid"},{"location":"appendix/maps-database/#mapdataoffshore","text":"Column Type id character varying(50) wfo character varying(10) lon numeric lat numeric location character varying(70) name character varying(90) the_geom geometry(MultiPolygon,4326)","title":"mapdata.offshore"},{"location":"appendix/maps-database/#mapdatarailroad","text":"Column Type fnode_ double precision tnode_ double precision lpoly_ double precision rpoly_ double precision length numeric railrdl021 double precision railrdl020 double precision feature character varying(18) name character varying(43) state character varying(2) state_fips character varying(2) the_geom geometry(MultiLineString,4326)","title":"mapdata.railroad"},{"location":"appendix/maps-database/#mapdatarfc","text":"Column Type site_id character varying(3) state character varying(2) rfc_name character varying(18) rfc_city character varying(25) basin_id character varying(5) the_geom geometry(MultiPolygon,4326)","title":"mapdata.rfc"},{"location":"appendix/maps-database/#mapdataspecialuse","text":"Column Type name character varying(32) code character varying(16) yn smallint alt_desc character varying(128) artcc character varying(4) ctr_agen character varying(128) sch_agen character varying(128) state character varying(2) the_geom geometry(MultiPolygon,4326)","title":"mapdata.specialuse"},{"location":"appendix/maps-database/#mapdatastates","text":"Column Type state character varying(2) name character varying(24) fips character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)","title":"mapdata.states"},{"location":"appendix/maps-database/#mapdatatimezones","text":"Column Type name character varying(50) time_zone character varying(1) standard character varying(9) advanced character varying(10) unix_time character varying(19) lon double precision lat double precision the_geom geometry(MultiPolygon,4326)","title":"mapdata.timezones"},{"location":"appendix/maps-database/#mapdatawarngenloc","text":"Column Type name character 
varying(254) st character varying(3) state character varying(20) population integer warngenlev integer cwa character varying(4) goodness double precision lat numeric lon numeric usedirs numeric(10,0) supdirs character varying(20) landwater character varying(3) recnum integer the_geom geometry(MultiPolygon,4326)","title":"mapdata.warngenloc"},{"location":"appendix/maps-database/#mapdataworld","text":"Column Type name character varying(30) count double precision first_coun character varying(2) first_regi character varying(1) the_geom geometry(MultiPolygon,4326)","title":"mapdata.world"},{"location":"appendix/maps-database/#mapdatazone","text":"Column Type state character varying(2) cwa character varying(9) time_zone character varying(2) fe_area character varying(2) zone character varying(3) name character varying(254) state_zone character varying(5) lon numeric lat numeric shortname character varying(32) the_geom geometry(MultiPolygon,4326)","title":"mapdata.zone"},{"location":"cave/bundles-and-procedures/","text":"Displays and Procedures \uf0c1 AWIPS contains two methods for saving and loading data resources: Displays are an all-encompassing way to save loaded resources and current view configurations either onto the connected EDEX server or to a local file for access in future CAVE sessions. Procedures are similar to Displays, but can be thought of as groups of procedure items which allow the user to save/load only parts of the procedure they desire, and allow the user to manage saved resources with more control. Displays \uf0c1 File > Load Display \uf0c1 Load a previously-saved display from within the AWIPS system. The pop-up dialog allows you to select your own saved displays as well as those saved by other users. When loading a display, all existing tabs will be closed and replaced with the contents from the saved display. Displays will load as many Map Editor tabs as existed when the display was originally saved. Load Display from Local Disk \uf0c1 To load a previously-saved display from a path within the file directory locally, select File > Load Display and then select the File button on the right to browse your local directories. File > Save Display \uf0c1 Save a product display within the AWIPS system. This saves the display to the EDEX server for your specific user. File > Save Display Locally \uf0c1 To save a product display to a path within the file directory locally, select File > Save Display Locally and then select the File button on the right. File > Delete Displays \uf0c1 Select and remove a saved display under File > Delete Displays ; this will open a pop-up dialog. Select the file name and click OK and then confirm deletion to remove the saved file permanently. Procedures \uf0c1 New Procedure \uf0c1 Select the menu File > Procedures > New... Select Copy Into to add all loaded resources from your current map to the Procedure Stack Select Save (or Save As ) and then enter a name for the Procedure before clicking OK to save. Open Procedure \uf0c1 Similar to creating a new Procedure, select File > Procedures > Open... , select the saved resources and click Load to load them to the current Map Editor tab. If multiple procedure items are wanted for loading, create a new tab for each procedure item and then load that item into the tab. This process is shown in the video below. Delete Procedure \uf0c1 From the menu File > Procedures > Delete...
you can delete existing Procedure files in a way similar to deleting saved display files.","title":"Displays and Procedures"},{"location":"cave/bundles-and-procedures/#displays-and-procedures","text":"AWIPS contains two methods for saving and loading data resources: Displays are an all-encompassing way to save loaded resources and current view configurations either onto the connected EDEX server or to a local file for access in future CAVE sessions. Procedures are similar to Displays, but can be thought of as groups of procedure items which allow the user to save/load only parts of the procedure they desire, and allow the user to manage saved resources with more control.","title":"Displays and Procedures"},{"location":"cave/bundles-and-procedures/#displays","text":"","title":"Displays"},{"location":"cave/bundles-and-procedures/#file-load-display","text":"Load a previously-saved display from within the AWIPS system. The pop-up dialog allows you to select your own saved displays as well as those saved by other users. When loading a display, all existing tabs will be closed and replaced with the contents from the saved display. Displays will load as many Map Editor tabs as existed when the display was originally saved.","title":"File > Load Display"},{"location":"cave/bundles-and-procedures/#load-display-from-local-disk","text":"To load a previously-saved display from a path within the file directory locally, select File > Load Display and then select the File button on the right to browse your local directories.","title":"Load Display from Local Disk"},{"location":"cave/bundles-and-procedures/#file-save-display","text":"Save a product display within the AWIPS system. This saves the display to the EDEX server for your specific user.","title":"File > Save Display"},{"location":"cave/bundles-and-procedures/#file-save-display-locally","text":"To save a product display to a path within the file directory locally, select File > Save Display Locally and then select the File button on the right.","title":"File > Save Display Locally"},{"location":"cave/bundles-and-procedures/#file-delete-displays","text":"Select and remove a saved display under File > Delete Displays ; this will open a pop-up dialog. Select the file name and click OK and then confirm deletion to remove the saved file permanently.","title":"File > Delete Displays"},{"location":"cave/bundles-and-procedures/#procedures","text":"","title":"Procedures"},{"location":"cave/bundles-and-procedures/#new-procedure","text":"Select the menu File > Procedures > New... Select Copy Into to add all loaded resources from your current map to the Procedure Stack Select Save (or Save As ) and then enter a name for the Procedure before clicking OK to save.","title":"New Procedure"},{"location":"cave/bundles-and-procedures/#open-procedure","text":"Similar to creating a new Procedure, select File > Procedures > Open... , select the saved resources and click Load to load them to the current Map Editor tab. If multiple procedure items are wanted for loading, create a new tab for each procedure item and then load that item into the tab. This process is shown in the video below.","title":"Open Procedure"},{"location":"cave/bundles-and-procedures/#delete-procedure","text":"From the menu File > Procedures > Delete...
you can delete existing Procedure files in a way similar to deleting saved display files.","title":"Delete Procedure"},{"location":"cave/cave-keyboard-shortcuts/","text":"Keyboard Shortcuts \uf0c1 D2D Menu Shortcuts \uf0c1 Action Command Open a New Map Ctrl + N Open a Display Ctrl + O Save Display Ctrl + S Save Display Locally Ctrl + Shift + S Save KML Ctrl + K Exit CAVE Alt + F4 Exit CAVE Ctrl + Q Clear Data Ctrl + C First Frame Ctrl + \u2190 Last Frame Ctrl + \u2192 Step Back \u2190 Step Forward \u2192 Increase Loop Speed Page Up Decrease Loop Speed Page Down Open Time Options Ctrl + T Toggle Image Combination Insert Open Loop Properties Ctrl + L Open Image Properties Ctrl + I D2D All Tilts Shortcuts \uf0c1 Note : Requires all tilts product in main display panel Action Command Step Back 1 Volume \u2190 Step Forward 1 Volume \u2192 Step up 1 Elevation Angle \u2191 Step down 1 Elevation Angle \u2193 Jump to First Frame Ctrl + \u2190 Jump to Last Frame Ctrl + \u2192 Jump to Highest Elevation Angle Ctrl + \u2191 Jump to Lowest Elevation Angle Ctrl + \u2193 D2D Numeric Keypad Shortcuts \uf0c1 Note : Num Lock must be enabled for these keystrokes to work Action Command Increase Brightness of Image 1, Decrease Image 2 [Numpad] + Decrease Brightness of Image 1, Increase Image 2 [Numpad] - Toggle Image Product in Main Map On/Off [Numpad] 0 Toggle First 9 Graphic Products On/Off [Numpad] 1-9 Toggle Next 10 Graphic Products On/Off Shift + [Numpad] 0-9 Toggle Between Images 1 and 2 at Full Brightness [Numpad] . Toggle Legend [Numpad] Enter Panel Combo Rotate (PCR) Shortcuts \uf0c1 Note : These numbers refer to the ones at the top of the Keyboard Action Command Cycle Through PCR Products Delete Return to 4 Panel View End Cycle Back Through PCR Products Backspace Display Corresponding Product 1-8 Text Editor Shortcuts \uf0c1 Action Command Extend Selection to Start of Line Shift + Home Extend Selection to End of Line Shift + End Extend Selection to Start of Document Ctrl + Shift + Home Extend Selection to End of Document Ctrl + Shift + End Extend Selection Up 1 Screen Shift + Page Up Extend Selection Down 1 Screen Shift + Page Down Extend Selection to Previous Character Shift + \u2190 Extend Selection by Previous Word Ctrl + Shift + \u2190 Extend Selection to Next Character Shift + \u2192 Extend Selection by Next Word Ctrl + Shift + \u2192 Extend Selection Up 1 Line Shift + \u2191 Extend Selection Down 1 Line Shift + \u2193 Delete Previous Word Ctrl + Backspace Delete Next Word Ctrl + Delete Close the Window Ctrl + Shift + F4 Undo Ctrl + Z Copy Ctrl + C Paste Ctrl + V Cut Ctrl + X","title":"Keyboard Shortcuts"},{"location":"cave/cave-keyboard-shortcuts/#keyboard-shortcuts","text":"","title":"Keyboard Shortcuts"},{"location":"cave/cave-keyboard-shortcuts/#d2d-menu-shortcuts","text":"Action Command Open a New Map Ctrl + N Open a Display Ctrl + O Save Display Ctrl + S Save Display Locally Ctrl + Shift + S Save KML Ctrl + K Exit CAVE Alt + F4 Exit CAVE Ctrl + Q Clear Data Ctrl + C First Frame Ctrl + \u2190 Last Frame Ctrl + \u2192 Step Back \u2190 Step Forward \u2192 Increase Loop Speed Page Up Decrease Loop Speed Page Down Open Time Options Ctrl + T Toggle Image Combination Insert Open Loop Properties Ctrl + L Open Image Properties Ctrl + I","title":"D2D Menu Shortcuts"},{"location":"cave/cave-keyboard-shortcuts/#d2d-all-tilts-shortcuts","text":"Note : Requires all tilts product in main display panel Action Command Step Back 1 Volume \u2190 Step Forward 1 Volume \u2192 Step up 1 Elevation Angle \u2191
Step down 1 Elevation Angle \u2193 Jump to First Frame Ctrl + \u2190 Jump to Last Frame Ctrl + \u2192 Jump to Highest Elevation Angle Ctrl + \u2191 Jump to Lowest Elevation Angle Ctrl + \u2193","title":"D2D All Tilts Shortcuts"},{"location":"cave/cave-keyboard-shortcuts/#d2d-numeric-keypad-shortcuts","text":"Note : Num Lock must be enabled for these keystrokes to work Action Command Increase Brightness of Image 1, Decrease Image 2 [Numpad] + Decrease Brightness of Image 1, Increase Image 2 [Numpad] - Toggle Image Product in Main Map On/Off [Numpad] 0 Toggle First 9 Graphic Products On/Off [Numpad] 1-9 Toggle Next 10 Graphic Products On/Off Shift + [Numpad] 0-9 Toggle Between Images 1 and 2 at Full Brightness [Numpad] . Toggle Legend [Numpad] Enter","title":"D2D Numeric Keypad Shortcuts"},{"location":"cave/cave-keyboard-shortcuts/#panel-combo-rotate-pcr-shortcuts","text":"Note : These numbers refer to the ones at the top of the Keyboard Action Command Cycle Through PCR Products Delete Return to 4 Panel View End Cycle Back Through PCR Products Backspace Display Corresponding Product 1-8","title":"Panel Combo Rotate (PCR) Shortcuts"},{"location":"cave/cave-keyboard-shortcuts/#text-editor-shortcuts","text":"Action Command Extend Selection to Start of Line Shift + Home Extend Selection to End of Line Shift + End Extend Selection to Start of Document Ctrl + Shift + Home Extend Selection to End of Document Ctrl + Shift + End Extend Selection Up 1 Screen Shift + Page Up Extend Selection Down 1 Screen Shift + Page Down Extend Selection to Previous Character Shift + \u2190 Extend Selection by Previous Word Ctrl + Shift + \u2190 Extend Selection to Next Character Shift + \u2192 Extend Selection by Next Word Ctrl + Shift + \u2192 Extend Selection Up 1 Line Shift + \u2191 Extend Selection Down 1 Line Shift + \u2193 Delete Previous Word Ctrl + Backspace Delete Next Word Ctrl + Delete Close the Window Ctrl + Shift + F4 Undo Ctrl + Z Copy Ctrl + C Paste Ctrl + V Cut Ctrl + X","title":"Text Editor Shortcuts"},{"location":"cave/cave-localization/","text":"Change Localization \uf0c1 Localization Preferences \uf0c1 The default localization site for Unidata AWIPS is OAX (Omaha, Nebraska, where the Raytheon team is located). When you are prompted to connect to an EDEX server, you can change the WFO ID as well.
Since release 16.1.4, CAVE users can switch the localization site to any valid NWS WFO from CAVE > Preferences > Localization , where edits can be made to both the site ID and EDEX server name. Click Restart after changes are applied. This window also has the option to Prompt for settings on startup , which if checked, would ask for the EDEX Server and Site location every time CAVE is started (this can be useful if you are used to switching between servers and/or sites). Change the site (example shows TBW Tampa Bay) and click Apply or OK and confirm the popup dialog, which informs you that you must restart CAVE for the changes to take effect.","title":"Change Localization"},{"location":"cave/cave-localization/#change-localization","text":"","title":"Change Localization"},{"location":"cave/cave-localization/#localization-preferences","text":"The default localization site for Unidata AWIPS is OAX (Omaha, Nebraska, where the Raytheon team is located). When you are prompted to connect to an EDEX server, you can change the WFO ID as well. Since release 16.1.4, CAVE users can switch the localization site to any valid NWS WFO from CAVE > Preferences > Localization , where edits can be made to both the site ID and EDEX server name. Click Restart after changes are applied. This window also has the option to Prompt for settings on startup , which if checked, would ask for the EDEX Server and Site location every time CAVE is started (this can be useful if you are used to switching between servers and/or sites). Change the site (example shows TBW Tampa Bay) and click Apply or OK and confirm the popup dialog, which informs you that you must restart CAVE for the changes to take effect.","title":"Localization Preferences"},{"location":"cave/cave-perspectives/","text":"D2D \uf0c1 D2D (Display 2-Dimensions) is the default AWIPS CAVE perspective, designed to mimic the look and feel of the legacy AWIPS I system. Frame control, map projection, image properties, and a few featured applications make up the primary D2D toolbar. CONUS is the default map display of the continental United States in a North Polar Stereographic projection. This menu allows you to select different projections. Clear will remove all non-system resources (meaning data) while preserving any map overlays you have added to the view. is a shortcut to Image Properties for the top-loaded image resource in the stack. freezes and un-freezes panning (movement) of the map. Valid time seq is the default time-matching setting for loading data. Select this menu to switch to configurations such as Latest, No Backfill, Previous run, Prognosis loop, and more. controls the frame number, display, speed, etc. You can also control the frames with the left and right keyboard keys. Application links to Warngen , Ncdata (NCP GEMPAK-like grids), Nsharp , and the Product Browser are also available. Switching Perspectives \uf0c1 D2D is one of many available CAVE perspectives. By selecting the CAVE > Perspective menu you can switch into the GFE , Hydro , Localization , MPE , or National Centers Perspective (which is available in the Other... submenu. Nobody seems to know why the NCP is not listed with the other perspectives, or how to make it appear with them). Resource Stack \uf0c1 At bottom-right of the map window is the Resource Stack, which displays all loaded resources and map overlays, and allows for interaction and customization with the resource via a right-click menu . Left-Click Resource Name to Hide \uf0c1 A left click on any resource in the stack will hide the resource and turn the label gray. Clicking the name again makes the resource visible. Hold-Right-Click Resource Name for Menu \uf0c1 Drag the mouse over a loaded resource and hold the right mouse button until a menu appears (simply clicking the resource with the right mouse button will toggle its visibility). The hold-right-click menu allows you to control individual resource Image Properties , Change Colormaps , change resource color, width, density, and magnification, move resources up and down in the stack, as well as configure custom options with other interactive resources. Hold-Right-Click the Map Background \uf0c1 for additional options, such as greater control over the resource stack legend, toggling a 4-panel display , selecting a Zoom level, and setting a Background Color . Most loaded resources will also have a menu option for reading out the pixel values: Product Browser \uf0c1 The Product Browser allows users to browse a complete data inventory in a side window, organized by data type.
Selections for GFE , Grids , Lightning , Map Overlays , Radar , Satellite , Redbook , and VIIRS are available. All products loaded with the Product Browser are given default settings.","title":"D2D"},{"location":"cave/cave-perspectives/#d2d","text":"D2D (Display 2-Dimensions) is the default AWIPS CAVE perspective, designed to mimic the look and feel of the legacy AWIPS I system. Frame control, map projection, image properties, and a few featured applications make up the primary D2D toolbar. CONUS is the default map display of the continental United States in a North Polar Stereographic projection. This menu allows you to select different projections. Clear will remove all non-system resources (meaning data) while preserving any map overlays you have added to the view. is a shortcut to Image Properties for the top-loaded image resource in the stack. freezes and un-freezes panning (movement) of the map. Valid time seq is the default time-matching setting for loading data. Select this menu to switch to configurations such as Latest, No Backfill, Previous run, Prognosis loop, and more. controls the frame number, display, speed, etc. You can also control the frames with the left and right keyboard keys. Application links to Warngen , Ncdata (NCP GEMPAK-like grids), Nsharp , and the Product Browser are also available.","title":"D2D"},{"location":"cave/cave-perspectives/#switching-perspectives","text":"D2D is one of many available CAVE perspectives. By selecting the CAVE > Perspective menu you can switch into the GFE , Hydro , Localization , MPE , or National Centers Perspective (which is available in the Other... submenu. Nobody seems to know why the NCP is not listed with the other perspectives, or how to make it appear with them).","title":"Switching Perspectives"},{"location":"cave/cave-perspectives/#resource-stack","text":"At bottom-right of the map window is the Resource Stack, which displays all loaded resources and map overlays, and allows for interaction and customization with the resource via a right-click menu .","title":"Resource Stack"},{"location":"cave/cave-perspectives/#left-click-resource-name-to-hide","text":"A left click on any resource in the stack will hide the resource and turn the label gray. Clicking the name again makes the resource visible.","title":"Left-Click Resource Name to Hide"},{"location":"cave/cave-perspectives/#hold-right-click-resource-name-for-menu","text":"Drag the mouse over a loaded resource and hold the right mouse button until a menu appears (simply clicking the resource with the right mouse button will toggle its visibility). The hold-right-click menu allows you to control individual resource Image Properties , Change Colormaps , change resource color, width, density, and magnification, move resources up and down in the stack, as well as configure custom options with other interactive resources.","title":"Hold-Right-Click Resource Name for Menu"},{"location":"cave/cave-perspectives/#hold-right-click-the-map-background","text":"for additional options, such as greater control over the resource stack legend, toggling a 4-panel display , selecting a Zoom level, and setting a Background Color . Most loaded resources will also have a menu option for reading out the pixel values:","title":"Hold-Right-Click the Map Background"},{"location":"cave/cave-perspectives/#product-browser","text":"The Product Browser allows users to browse a complete data inventory in a side window, organized by data type.
Selections for GFE , Grids , Lightning , Map Overlays , Radar , Satellite , Redbook , and VIIRS are available. All products loaded with the Product Browser are given default settings.","title":"Product Browser"},{"location":"cave/d2d-edit-menus/","text":"Editing Menus \uf0c1 Any of the menus in the menubar can be customized in the Localization Perspective . Modifying Menus \uf0c1 Once in the Localization Perspective , menus can be modified by going to the D2D > Menus directory in the File Browser. Here there are submenus for different data types and menu structures. Usually the index.xml file found in these submenus is the master file on which the actual menu is based. This file can reference other xml files and you may have to modify these child xml files to get the results you are looking for. In order to modify any file, you must right click on it and select Copy To > USER (my-username) . Then you may open this copy and begin to modify it. Once this process has been completed and a change has been made and saved, CAVE will need to be restarted and opened in the D2D perspective to see the change. This example covers how to add a new menu entry to an existing menu. Switch to the Localization Perspective Find the grid folder under D2D > Menus Double-click to expand index.xml Right-click BASE (common_static) and select Copy To... , then select USER level Double-click USER to open the editor and copy an existing include tag, and update the modelName (this must match an existing product found in the Product Browser) and the menuName (this can be anything) Once this is completed, save the file and restart CAVE Navigate to the Models menu and you should see a new entry with GEFS Removing Menus \uf0c1 This example covers how to remove a menu (in this case MRMS ) from D2D: Switch to the Localization Perspective Find the mrms folder under D2D > Menus Double-click to expand index.xml Right-click BASE and select Copy To... , then select USER level Right-click and refresh the mrms entry Double-click USER to open the editor and change to With this completed, you can now restart CAVE and will not see the MRMS menu anymore. Repeat this example for other product menus, such as radar , upperair , tools , etc., to further customize D2D data menus for any level of localization.","title":"Editing Menus"},{"location":"cave/d2d-edit-menus/#editing-menus","text":"Any of the menus in the menubar can be customized in the Localization Perspective .","title":"Editing Menus"},{"location":"cave/d2d-edit-menus/#modifying-menus","text":"Once in the Localization Perspective , menus can be modified by going to the D2D > Menus directory in the File Browser. Here there are submenus for different data types and menu structures. Usually the index.xml file found in these submenus is the master file on which the actual menu is based. This file can reference other xml files and you may have to modify these child xml files to get the results you are looking for. In order to modify any file, you must right click on it and select Copy To > USER (my-username) . Then you may open this copy and begin to modify it. Once this process has been completed and a change has been made and saved, CAVE will need to be restarted and opened in the D2D perspective to see the change. This example covers how to add a new menu entry to an existing menu. Switch to the Localization Perspective Find the grid folder under D2D > Menus Double-click to expand index.xml Right-click BASE (common_static) and select Copy To... 
, then select USER level Double-click USER to open the editor and copy an existing include tag, and update the modelName (this must match an existing product found in the Product Browser) and the menuName (this can be anything) Once this is completed, save the file and restart CAVE Navigate to the Models menu and you should see a new entry with GEFS","title":"Modifying Menus"},{"location":"cave/d2d-edit-menus/#removing-menus","text":"This example covers how to remove a menu (in this case MRMS ) from D2D: Switch to the Localization Perspective Find the mrms folder under D2D > Menus Double-click to expand index.xml Right-click BASE and select Copy To... , then select USER level Right-click refresh the mrms entry Double click USER to open the editor and change to With this completed, you can now restart CAVE and will not see the MRMS menu anymore. Repeat this example for other product menus, such as radar , upperair , tools , etc., to further customize D2D data menus for any level of localization.","title":"Removing Menus"},{"location":"cave/d2d-gis-shapefiles/","text":"GIS Import \uf0c1 The Geographic Information System (GIS) Import menu entry enables users to import geospatial data from varying GIS data sources for display in CAVE. CAVE currently only supports shape data in WGS84 unprojected latitude/longitude. This section describes how to: Load GIS Data in CAVE Modify the GIS Data Preferences Customize the Attributes Label GIS Data Display GIS Data \uf0c1 Importing a GIS shapefile is accessed through File > Import > GIS Data . The GIS DataStore Parameters dialog is comprised of four sections: DataStore Type : You can select a file type from the dropdown list. The only option is GIS File . Connection Parameters : Click the Browse button and navigate to the directory where your shapefiles are. Pressing Connect will populate the available shapefiles. Load As : Shapefiles can be loaded as a Map or as a Product. Map : The selected shapefile displays as a map, similar to if you load a map from the Maps menu. Product : When this radio button is selected, you will also need to select the start and end date/time the data is valid for. The selected shapefile displays as a product with a shaded (color-filled) image. When plotting with additional products, if the display time falls within the start/end time range selected, the shapefile will display. When the valid time falls outside the start/end time, the map product image does not display. Table : This section lists all of the available shapefiles that are available for display. GIS Data Preferences \uf0c1 Updating GIS display preferences is accessed through CAVE > Preferences > GIS Viewer . You are able to alter the highlight color, style, width, and opacity of the product in the Main Display here. Customizing the GIS Attribute Dialog \uf0c1 You have the ability to highlight or hide specific areas of the displayed map. These functionalities are available by right click and holding on the Map Product ID in the Legend area and selecting Display Attributes . The pop-up window is commonly referred to as the \"Attributes Table\". For each row of information there is an associated map/map product image displayed on the Main Display Pane. Highlighting \uf0c1 Highlighting Selected Areas \uf0c1 To highlight a selected area(s) of the GIS image, highlight the corresponding row(s) in the Attributes Table, right click and hold on one of the selected rows, and check the Highlighted checkbox. 
Active highlighted rows will be yellow in the table and the corresponding area in the map display will be pink. Unhighlighting Selected Areas \uf0c1 You can unhighlight by selecting the row, holding the right mouse button, and unchecking the Highlighted checkbox. Unhighlighting All Areas \uf0c1 To remove all highlights, select Annotation > Clear Highlights . If you are interested in a particular area in the Main Display Pane, but don't know where it is in the Attributes Table, left double-click on the area of interest and the corresponding row will be highlighted. Controlling Visibility of Image Areas \uf0c1 Hiding Selected Areas \uf0c1 To hide a selected area(s) of the GIS image, highlight the corresponding row(s) in the Attributes Table, right click and hold on one of the selected rows, and uncheck the Visible checkbox. Hidden rows will be gray in the table and the corresponding area in the map display will disappear. Unhiding Selected Areas \uf0c1 You can make these images visible by selecting the row, holding the right mouse button, and checking the Visible checkbox. Unhiding All Areas \uf0c1 To make all images visible, select Annotation > Make All Visible . Configuring Attributes Table \uf0c1 In the Attributes Table, you have the option to sort by columns and select which columns are displayed. Selecting Columns to Display \uf0c1 By default, all available columns are displayed. The Select Columns dialog will pop up if you select Data > Select Columns... . You can highlight the columns and use the arrows to move them into the Available or Displayed columns. Clicking OK will update your table. Sorting Column Information \uf0c1 The Sort Order dialog will pop up if you select Data > Sort... . You can use the drop down menu to choose the column to sort by and then sort by Ascending or Descending. You can sort by additional columns. Clicking OK will update your table. Labeling GIS Data \uf0c1 You can select which attribute you want to use to label the objects on the Main Display. To open the Label submenu, right click and hold on the Map Product ID in the Legend area to open a pop-up menu and select Label and choose which attribute you want as the label.","title":"GIS and Shapefiles"},{"location":"cave/d2d-gis-shapefiles/#gis-import","text":"The Geographic Information System (GIS) Import menu entry enables users to import geospatial data from varying GIS data sources for display in CAVE. CAVE currently only supports shape data in WGS84 unprojected latitude/longitude. This section describes how to: Load GIS Data in CAVE Modify the GIS Data Preferences Customize the Attributes Label GIS Data","title":"GIS Import"},{"location":"cave/d2d-gis-shapefiles/#display-gis-data","text":"Importing a GIS shapefile is accessed through File > Import > GIS Data . The GIS DataStore Parameters dialog is comprised of four sections: DataStore Type : You can select a file type from the dropdown list. The only option is GIS File . Connection Parameters : Click the Browse button and navigate to the directory where your shapefiles are. Pressing Connect will populate the available shapefiles. Load As : Shapefiles can be loaded as a Map or as a Product. Map : The selected shapefile displays as a map, similar to if you load a map from the Maps menu. Product : When this radio button is selected, you will also need to select the start and end date/time the data is valid for. The selected shapefile displays as a product with a shaded (color-filled) image. 
When plotting with additional products, if the display time falls within the start/end time range selected, the shapefile will display. When the valid time falls outside the start/end time, the map product image does not display. Table : This section lists all of the shapefiles that are available for display.","title":"Display GIS Data"},{"location":"cave/d2d-gis-shapefiles/#gis-data-preferences","text":"Updating GIS display preferences is accessed through CAVE > Preferences > GIS Viewer . You are able to alter the highlight color, style, width, and opacity of the product in the Main Display here.","title":"GIS Data Preferences"},{"location":"cave/d2d-gis-shapefiles/#customizing-the-gis-attribute-dialog","text":"You have the ability to highlight or hide specific areas of the displayed map. These functionalities are available by right click and holding on the Map Product ID in the Legend area and selecting Display Attributes . The pop-up window is commonly referred to as the \"Attributes Table\". For each row of information there is an associated map/map product image displayed on the Main Display Pane.","title":"Customizing the GIS Attribute Dialog"},{"location":"cave/d2d-gis-shapefiles/#highlighting","text":"","title":"Highlighting"},{"location":"cave/d2d-gis-shapefiles/#highlighting-selected-areas","text":"To highlight a selected area(s) of the GIS image, highlight the corresponding row(s) in the Attributes Table, right click and hold on one of the selected rows, and check the Highlighted checkbox. Active highlighted rows will be yellow in the table and the corresponding area in the map display will be pink.","title":"Highlighting Selected Areas"},{"location":"cave/d2d-gis-shapefiles/#unhighlighting-selected-areas","text":"You can unhighlight by selecting the row, holding the right mouse button, and unchecking the Highlighted checkbox.","title":"Unhighlighting Selected Areas"},{"location":"cave/d2d-gis-shapefiles/#unhighlighting-all-areas","text":"To remove all highlights, select Annotation > Clear Highlights . If you are interested in a particular area in the Main Display Pane, but don't know where it is in the Attributes Table, left double-click on the area of interest and the corresponding row will be highlighted.","title":"Unhighlighting All Areas"},{"location":"cave/d2d-gis-shapefiles/#controlling-visibility-of-image-areas","text":"","title":"Controlling Visibility of Image Areas"},{"location":"cave/d2d-gis-shapefiles/#hiding-selected-areas","text":"To hide a selected area(s) of the GIS image, highlight the corresponding row(s) in the Attributes Table, right click and hold on one of the selected rows, and uncheck the Visible checkbox. Hidden rows will be gray in the table and the corresponding area in the map display will disappear.","title":"Hiding Selected Areas"},{"location":"cave/d2d-gis-shapefiles/#unhiding-selected-areas","text":"You can make these images visible by selecting the row, holding the right mouse button, and checking the Visible checkbox.","title":"Unhiding Selected Areas"},{"location":"cave/d2d-gis-shapefiles/#unhiding-all-areas","text":"To make all images visible, select Annotation > Make All Visible .","title":"Unhiding All Areas"},{"location":"cave/d2d-gis-shapefiles/#configuring-attributes-table","text":"In the Attributes Table, you have the option to sort by columns and select which columns are displayed.","title":"Configuring Attributes Table"},{"location":"cave/d2d-gis-shapefiles/#selecting-columns-to-display","text":"By default, all available columns are displayed. 
The Select Columns dialog will pop-up if you select Data > Select Columns... . You can highlight the columns use the arrows to move them into the Available or Displayed columns. Clicking OK will update your table.","title":"Selecting Columns to Display"},{"location":"cave/d2d-gis-shapefiles/#sorting-column-information","text":"The Sort Order dialog will pop-up if you select Data > Sort... . You can use the drop down menu to choose the column to sort by and then sort by Ascending or Descending. You can sort by additional columns. Clicking OK will update your table.","title":"Sorting Column Information"},{"location":"cave/d2d-gis-shapefiles/#labeling-gis-data","text":"You can select which attribute you want to use to label the objects on the Main Display. To open the Label submenu, right click and hold on the Map Product ID in the Legend area to open a pop-up menu and select Label and choose which attribute you want as the label.","title":"Labeling GIS Data"},{"location":"cave/d2d-gridded-models/","text":"Volume Browser \uf0c1 The Volume Browser provides access to numerical models, sounding data, and selected point data sources, such as RAOB, METAR, and Profiler. Through the Browser interface, you can choose the data source(s), field(s), plane(s), and point(s), and generate a customized list of model graphics or images for display. The Volume Browser can be accessed from either the Tools (alphabetically organized) or Models (first option) menus. Visual Overview \uf0c1 The Volume Browser window is divided into four areas: The Menu Bar along the top The Data Selection Menus The Product Selection List The Load Buttons (Diff and Load) to load items from the Product Selection List Each area is then subdivided into menu components. The menu bar along the top of the Volume Browser window has dropdown lists that contain options for controlling all the various menu choices of the Volume Browser. Volume Browser Menu Bar \uf0c1 The dropdown menus in the Volume Browser menu bar contain options for controlling and manipulating the Volume Browser or the products chosen through the Volume Browser File Clone Exit Edit Clear All Clear Sources Clear Fields Clear Panes Select None Select All Find (Ctrl+F) Tools Display Types Loop Types VB Tools \uf0c1 Baselines \uf0c1 Selecting Baselines displays 10 lines, labeled A-A' to J-J', along which cross-sections can be constructed from within the Volume Browser. These baseline resources are editable . If you are zoomed in over an area when you load baselines and none appear, press the middle mouse button (B3) to \"snap\" a baseline to where the mouse cursor is. The system chooses a baseline that has not been recently used. If you are working with a baseline, a second click with B3 will return you to the original baseline, even if you modified another baseline. Points \uf0c1 Points are used to generate model soundings, time-height cross-sections, time series, and variable vs. height plots using the Volume Browser. As with the Baselines, the locations of these Points can be edited in the following manner: \"Snapping\" an Interactive Point : If you are zoomed in over an area when you load Interactive Points and no Points appear, click B3 to \"snap\" a Point to where the mouse cursor is positioned. The system chooses a Point that has not been recently used. If you are currently working with a Point, then a second B3 click will place another Point at the location of your cursor. 
Dynamic Reference Map : When you generate a model sounding, a time-height cross-section, a time series, or a variable vs. height plot, a small reference map indicating the location(s) of the plotted sounding(s) is provided in the upper left corner of the Main Display Pane. Points may be created, deleted, hidden, and manipulated (location, name, font, and color). Points are not limited in terms of number, location, or designation. Points may also be assigned to different groups to facilitate their use. Choose By ID \uf0c1 Choose By ID, which is a function of DMD (Digital Mesocyclone Display), is a method of selecting feature locations. The tool is used to monitor the same feature at a certain location. Without the Choose By ID tool, a monitored feature (over a period of time) could move away from its monitored location and another feature could move in its place. You can use Choose By ID to set points, baselines, and \"Home\" for conventional locations like METARs and RAOBs (Radiosonde Observations), but its primary use is for the WSR-88D-identified mesocyclone locations. Display Types \uf0c1 Plan View (default) \uf0c1 This is the default option for the Volume Browser. From the Plan-view perspective, data are plotted onto horizontal surfaces. The additional options menu that appears in the Volume Browser menu bar allows you to choose whether you want the Plan view data to Animate in Time or Animate in Space. Cross Section \uf0c1 Allows you to view gridded data as vertical slices along specific baselines. You need to use either the Interactive Baseline Tool or the predefined latitude/longitude baselines to specify the slice you wish to see. One of the additional options menus that appear in the Volume Browser menu bar allows you to choose whether you want the cross-section data to animate in time or space, while the other options menu allows you to adjust the vertical resolution. Descriptions of these options follows. (Note that the Fields and Planes submenu labels have changed after selecting \"Cross section.\") Time Height \uf0c1 Used in conjunction with the Interactive Points Tool to enable you to view a time height cross section of a full run of gridded model data for a specific location. Additional options menus in the Volume Browser menu bar allow you to choose the direction in which you want the data to be plotted, and to adjust the vertical resolution. Var vs Hgt \uf0c1 Enables you to view a profile of a meteorological model field as it changes through height, which is displayed in millibars. By using the Interactive Points Tool, you can select one or more locations from which to plot the data. Sounding \uf0c1 Works in conjunction with the Interactive Points Tool to enable you to generate a Skew-T chart for a specific location, no additional menus appear in the Volume Browser when the Soundings setting is chosen. Time Series \uf0c1 Used in conjunction with the Interactive Points Tool to enable you to plot gridded data on a time versus data value graph for a specified point. Loop Types \uf0c1 Time \uf0c1 The default option for the Volume Browser. It allows you to view model data through time Space \uf0c1 Allows you to loop through a series of predefined latitude or longitude cross-sectional slices at a fixed time.","title":"Volume Browser"},{"location":"cave/d2d-gridded-models/#volume-browser","text":"The Volume Browser provides access to numerical models, sounding data, and selected point data sources, such as RAOB, METAR, and Profiler. 
Through the Browser interface, you can choose the data source(s), field(s), plane(s), and point(s), and generate a customized list of model graphics or images for display. The Volume Browser can be accessed from either the Tools (alphabetically organized) or Models (first option) menus.","title":"Volume Browser"},{"location":"cave/d2d-gridded-models/#visual-overview","text":"The Volume Browser window is divided into four areas: The Menu Bar along the top The Data Selection Menus The Product Selection List The Load Buttons (Diff and Load) to load items from the Product Selection List Each area is then subdivided into menu components. The menu bar along the top of the Volume Browser window has dropdown lists that contain options for controlling all the various menu choices of the Volume Browser.","title":"Visual Overview"},{"location":"cave/d2d-gridded-models/#volume-browser-menu-bar","text":"The dropdown menus in the Volume Browser menu bar contain options for controlling and manipulating the Volume Browser or the products chosen through the Volume Browser File Clone Exit Edit Clear All Clear Sources Clear Fields Clear Panes Select None Select All Find (Ctrl+F) Tools Display Types Loop Types","title":"Volume Browser Menu Bar"},{"location":"cave/d2d-gridded-models/#vb-tools","text":"","title":"VB Tools"},{"location":"cave/d2d-gridded-models/#baselines","text":"Selecting Baselines displays 10 lines, labeled A-A' to J-J', along which cross-sections can be constructed from within the Volume Browser. These baseline resources are editable . If you are zoomed in over an area when you load baselines and none appear, press the middle mouse button (B3) to \"snap\" a baseline to where the mouse cursor is. The system chooses a baseline that has not been recently used. If you are working with a baseline, a second click with B3 will return you to the original baseline, even if you modified another baseline.","title":"Baselines"},{"location":"cave/d2d-gridded-models/#points","text":"Points are used to generate model soundings, time-height cross-sections, time series, and variable vs. height plots using the Volume Browser. As with the Baselines, the locations of these Points can be edited in the following manner: \"Snapping\" an Interactive Point : If you are zoomed in over an area when you load Interactive Points and no Points appear, click B3 to \"snap\" a Point to where the mouse cursor is positioned. The system chooses a Point that has not been recently used. If you are currently working with a Point, then a second B3 click will place another Point at the location of your cursor. Dynamic Reference Map : When you generate a model sounding, a time-height cross-section, a time series, or a variable vs. height plot, a small reference map indicating the location(s) of the plotted sounding(s) is provided in the upper left corner of the Main Display Pane. Points may be created, deleted, hidden, and manipulated (location, name, font, and color). Points are not limited in terms of number, location, or designation. Points may also be assigned to different groups to facilitate their use.","title":"Points"},{"location":"cave/d2d-gridded-models/#choose-by-id","text":"Choose By ID, which is a function of DMD (Digital Mesocyclone Display), is a method of selecting feature locations. The tool is used to monitor the same feature at a certain location. Without the Choose By ID tool, a monitored feature (over a period of time) could move away from its monitored location and another feature could move in its place. 
You can use Choose By ID to set points, baselines, and \"Home\" for conventional locations like METARs and RAOBs (Radiosonde Observations), but its primary use is for the WSR-88D-identified mesocyclone locations.","title":"Choose By ID"},{"location":"cave/d2d-gridded-models/#display-types","text":"","title":"Display Types"},{"location":"cave/d2d-gridded-models/#plan-view-default","text":"This is the default option for the Volume Browser. From the Plan-view perspective, data are plotted onto horizontal surfaces. The additional options menu that appears in the Volume Browser menu bar allows you to choose whether you want the Plan view data to Animate in Time or Animate in Space.","title":"Plan View (default)"},{"location":"cave/d2d-gridded-models/#cross-section","text":"Allows you to view gridded data as vertical slices along specific baselines. You need to use either the Interactive Baseline Tool or the predefined latitude/longitude baselines to specify the slice you wish to see. One of the additional options menus that appear in the Volume Browser menu bar allows you to choose whether you want the cross-section data to animate in time or space, while the other options menu allows you to adjust the vertical resolution. Descriptions of these options follows. (Note that the Fields and Planes submenu labels have changed after selecting \"Cross section.\")","title":"Cross Section"},{"location":"cave/d2d-gridded-models/#time-height","text":"Used in conjunction with the Interactive Points Tool to enable you to view a time height cross section of a full run of gridded model data for a specific location. Additional options menus in the Volume Browser menu bar allow you to choose the direction in which you want the data to be plotted, and to adjust the vertical resolution.","title":"Time Height"},{"location":"cave/d2d-gridded-models/#var-vs-hgt","text":"Enables you to view a profile of a meteorological model field as it changes through height, which is displayed in millibars. By using the Interactive Points Tool, you can select one or more locations from which to plot the data.","title":"Var vs Hgt"},{"location":"cave/d2d-gridded-models/#sounding","text":"Works in conjunction with the Interactive Points Tool to enable you to generate a Skew-T chart for a specific location, no additional menus appear in the Volume Browser when the Soundings setting is chosen.","title":"Sounding"},{"location":"cave/d2d-gridded-models/#time-series","text":"Used in conjunction with the Interactive Points Tool to enable you to plot gridded data on a time versus data value graph for a specified point.","title":"Time Series"},{"location":"cave/d2d-gridded-models/#loop-types","text":"","title":"Loop Types"},{"location":"cave/d2d-gridded-models/#time","text":"The default option for the Volume Browser. 
It allows you to view model data through time","title":"Time"},{"location":"cave/d2d-gridded-models/#space","text":"Allows you to loop through a series of predefined latitude or longitude cross-sectional slices at a fixed time.","title":"Space"},{"location":"cave/d2d-grids/","text":"MSLP and Precipitation \uf0c1 Sfc Temperature and Wind \uf0c1 Sfc Dewpoint Temperature \uf0c1 Sfc Relative Humidity \uf0c1 30mb Mean Dewpoint \uf0c1 Precipitable Water \uf0c1 Simulated Reflectivity (REFC) \uf0c1 Lightning Threat \uf0c1 Precip Type / Moisture Transport \uf0c1 Vorticity (500mb) \uf0c1 Vertical Velocity (500mb, 700mb, 850mb) \uf0c1 Thickness / Vorticity Advection (Trenberth) \uf0c1 Wind / Height (850mb, 700mb, 500mb, 300mb, 250mb) \uf0c1 Potential Vorticity (250mb) \uf0c1 Helicity / Storm-Relative Flow \uf0c1 Hail Parameters \uf0c1 MCS Parameters \uf0c1 Isentopic Analysis (270K-320K) \uf0c1","title":"D2d grids"},{"location":"cave/d2d-grids/#mslp-and-precipitation","text":"","title":"MSLP and Precipitation"},{"location":"cave/d2d-grids/#sfc-temperature-and-wind","text":"","title":"Sfc Temperature and Wind"},{"location":"cave/d2d-grids/#sfc-dewpoint-temperature","text":"","title":"Sfc Dewpoint Temperature"},{"location":"cave/d2d-grids/#sfc-relative-humidity","text":"","title":"Sfc Relative Humidity"},{"location":"cave/d2d-grids/#30mb-mean-dewpoint","text":"","title":"30mb Mean Dewpoint"},{"location":"cave/d2d-grids/#precipitable-water","text":"","title":"Precipitable Water"},{"location":"cave/d2d-grids/#simulated-reflectivity-refc","text":"","title":"Simulated Reflectivity (REFC)"},{"location":"cave/d2d-grids/#lightning-threat","text":"","title":"Lightning Threat"},{"location":"cave/d2d-grids/#precip-type-moisture-transport","text":"","title":"Precip Type / Moisture Transport"},{"location":"cave/d2d-grids/#vorticity-500mb","text":"","title":"Vorticity (500mb)"},{"location":"cave/d2d-grids/#vertical-velocity-500mb-700mb-850mb","text":"","title":"Vertical Velocity (500mb, 700mb, 850mb)"},{"location":"cave/d2d-grids/#thickness-vorticity-advection-trenberth","text":"","title":"Thickness / Vorticity Advection (Trenberth)"},{"location":"cave/d2d-grids/#wind-height-850mb-700mb-500mb-300mb-250mb","text":"","title":"Wind / Height (850mb, 700mb, 500mb, 300mb, 250mb)"},{"location":"cave/d2d-grids/#potential-vorticity-250mb","text":"","title":"Potential Vorticity (250mb)"},{"location":"cave/d2d-grids/#helicity-storm-relative-flow","text":"","title":"Helicity / Storm-Relative Flow"},{"location":"cave/d2d-grids/#hail-parameters","text":"","title":"Hail Parameters"},{"location":"cave/d2d-grids/#mcs-parameters","text":"","title":"MCS Parameters"},{"location":"cave/d2d-grids/#isentopic-analysis-270k-320k","text":"","title":"Isentopic Analysis (270K-320K)"},{"location":"cave/d2d-hydro/","text":"The NCEP/Hydro menu contains nine sections: SPC, TPC, NCO, HPC, MPC, CPC, AWC, Hydro, and Local Analyses/Statistical Guidance. Each section is further subdivided into related products, as described below. For more information on hydro products, refer to documentation prepared by the NWS' Office of Hydrology. SPC \uf0c1 Storm Prediction Center (SPC) Watches, Severe Weather Plots, SPC Convective Outlooks, and Fire Weather information. Severe Weather Plots are extracted from the STADTS and STAHRY text products and plotted to time-match the current display. The Severe Weather Plots data set in the NCEP/Hydro Menu can be interrogated (sampled) for more detailed information by clicking mouse Button 1 (B1) over a site. 
TPC \uf0c1 Contains the hurricane submenu, which comprises graphic products that display the Marine/Tropical Cyclone Advisory (TCM), the Public Tropical Cyclone Advisory (TCP), hourly forecasts, and model guidance. HPC \uf0c1 Contains 6-hour QPF (Quantitative Precipitation Forecast) data plus the submenus, described below, for Precipitation and Temps & Weather products. Precipitation Contains probabilities of daily precipitation, precipitation accumulation, and probabilities of daily snowfall. In addition, this submenu enables you to display QPF projections for 1 to 3 days in 6 hour increments, 4 to 5 days in 48 hour increments, and 1 to 5 days in 120 hour increments. The HPC Excessive Rainfall product consists of a contour graphic and image of the excessive rainfall for day 1 (with forecast times of 21, 24, 27, or 30 hours), and days 2 and 3 (both with forecast times of 48 and 72 hours). The HPC product will update the selected forecast cycle twice per day. Temps & Weather Contains daily Max/Min temperature anomalies, daily heat index probabilities, and pressure and frontal analysis. MPC \uf0c1 Contains the Marine Guidance submenu, which includes marine analyses and model guidance. Note that the Marine Prediction Center (MPC) is now called the Ocean Prediction Center (OPC). CPC \uf0c1 Contains threat charts and outlook grids derived from these two submenus: Threat Charts Contains drought monitoring data, daily threats assessment, and daily heat index forecasts. Outlook Grids Contains temperature and precipitation probabilities. AWC \uf0c1 Contains CCFP (Collaborative Convective Forecast Product), an aviation product. Formerly located under the Aviation option on the Upper Air menu, CCFP is a strategic forecast of convection to guide traffic managers in their system-wide approach to managing traffic. The forecast suite consists of 3 forecast maps with selectable lead times (4, 6, and 8 hours). The forecasts are issued by the Aviation Weather Center (AWC) between March 1 and October 30, eleven times per day. CCFP is alpha-numeric information suitable for the graphical depiction of forecast areas of significant thunderstorms. The CCFP message covers the CONUS area, and includes information on the location of thunderstorm areas, and associated information such as storm tops, coverage, confidence, and direction/speed of movement. NCO \uf0c1 Contains Precip & Stability, Temps & Weather, National Centers model, NGM MOS (NGM-based MOS system), and the following Sounding-derived plots submenus. Precip & Stability : Contains precipitation, radar, and stability products. Temps & Weather : Contains Max/Min temperature, freezing level, weather depiction, and surface geostrophic wind and relative vorticity plots. National Centers Models : Contains model guidance from the National Centers Sounding-derived plots : Contains options to display model soundings (sometimes called \"BUFR soundings\" because they are packaged in BUFR format for transmission). These are soundings extracted directly from the model, including all levels not generated from the pressure-level grids used elsewhere in the system. Sounding Availability This option displays the sounding locations (shown with asterisks) available from the latest model run; typically these locations coincide with TAF (Terminal Aerodrome Forecast) locations. The plot will update with each model run. Because the sounding data is quite voluminous, only soundings over your State(s) scale are saved. 
Surface The Surface Plots, which mimic the METAR Surface Plots, are taken from the model-derived soundings and provide hourly forecast surface plots. Because you cannot see all forecast projections in a 32 frame loop (e.g., displaying the entire North American Model (NAM) or Global Forecasting System (GFS) run would require 61 frames), you will probably want to use the Time Options Tool (refer to Subsection 2.2.6.4) to view a subset of the forecast -- perhaps a continuous run of hours or every other hour for the whole run. Ceiling/Visibility The \"Ceil/Vis Plot\" shows weather (rain, frz rain, snow) on the right, a stack of three cloud layers above, and visibility below the METAR station. The cloud layers are defined as low (990mb-640mb), mid (640mb-350mb), and high (<350mb). Each cloud layer shows a coverage circle with clear, sct, bkn, and ovc options. Next to one of the circles, there may be a cloud base. The cloud base is sent as a pressure, but is plotted in hft MSL based on a Standard Atmosphere conversion. Because the cloud layers and the cloud base are generated from separate algorithms at NCEP (National Centers for Environmental Prediction), it is possible to have broken or overcast clouds indicated but no base; alternatively, the base may be shown with a high overcast, while ignoring a mid broken layer. Also, a cloud base is reported if convective precipitation is indicated, even for only 10-20% cloud cover. As a result, one can see a cloud base associated with scattered clouds. 1 Hr and 3 Hr Precip Amt This option shows hourly amounts for NAM and 3 hour intervals for GFS at each location. Cloud Layers This option displays the amount of low, middle, and high cloud cover, each as a standard sky coverage symbol, and weather type as a weather symbol. Hydro \uf0c1 Contains QPE, QPF, and RFC Flash Flood Guidance submenus. Hydro Applications, such as HydroView and MPE Editor, are loaded from the Perspectives dialog (Hydro and MPE, respectively) or from the HydroApps menu in the Hydro(View) Perspective (Hydrobase, RiverPro, XDAT, Forecast Service, River Monitor, Precip Monitor, SSHP, and Dam Catalog). QPE : Makes available mosaic images of RFC-generated Quantitative Precipitation Estimator (QPE) and the Multisensor Precipitation Estimator (MPE) grids, which are displayed using a 'truncated' grid color table that shows zero values in gray to let you see the limits of the site-specified domain. These mosaic images are generated by the RFCs in 1, 6, and 24 hour cycles. The MPE grids can be displayed as local contours or images. NESDIS produces two types of Satellite Precipitation Estimates (SPE) based on GOES (Geostationary Operational Environmental Satellite) imagery series: Auto SPEs and Manual SPEs. Auto SPEs, which can be displayed directly from the QPE submenu, are produced hourly based on the most recent one-hour series of IR GOES imagery. This product is displayable on any AWIPS scale. The Auto SPE estimates are displayed in units of inches of precipitation that fell during the specified one hour period. Manual SPEs are accessible through the Manual SPE submenu. You can access the Manual SPE submenu from the QPE submenu. Generation of these products requires substantial manual intervention by NESDIS personnel; consequently, these products are generated and distributed to AWIPS at variable frequencies, as significant precipitation events warrant (i.e., their frequency is variable). The duration (or valid period) of the Manual SPEs is also variable. 
Whereas the duration of Auto SPEs is always one hour, the duration of the Manual SPEs ranges from 1 to 12 hours. Furthermore, although each Manual SPE product is mapped to a CONUS grid, the area of analysis is usually regional (focusing on the significant precipitation event). Apart from these important differences, the Manual SPEs are very similar to the Auto SPEs. QPF : Displays QPF, which indicate how much precipitation will occur in a particular grid. QPFs, which are issued by the RFCs, display as contours by default. However, from the pop-up menu you can convert them to image form. RFC Flash Flood Guidance : Displays County and Zone Flash Flood Guidance (FFG) grids on any scale. The area for which the data is displayed is limited, but the site system manager may configure a larger area. In addition, 1h, 3h, and 6h mosaic RFC-generated FFG grids can be displayed for both local and other RFC locations. Local Analyses/Statistical Guidance \uf0c1 Model Output Statistical (MOS) plots derived from the MOS BUFR and Text Bulletins display forecast data for GFS MOS, GFS-Extended MOS, Eta MOS, and NGM MOS. The plots are accessed by selecting NGM or GFS-LAMP/MOS forecasts under the Local Analyses/Statistical Guidance option.","title":"D2d hydro"},{"location":"cave/d2d-hydro/#spc","text":"Storm Prediction Center (SPC) Watches, Severe Weather Plots, SPC Convective Outlooks, and Fire Weather information. Severe Weather Plots are extracted from the STADTS and STAHRY text products and plotted to time-match the current display. The Severe Weather Plots data set in the NCEP/Hydro Menu can be interrogated (sampled) for more detailed information by clicking mouse Button 1 (B1) over a site.","title":"SPC"},{"location":"cave/d2d-hydro/#tpc","text":"Contains the hurricane submenu, which comprises graphic products that display the Marine/Tropical Cyclone Advisory (TCM), the Public Tropical Cyclone Advisory (TCP), hourly forecasts, and model guidance.","title":"TPC"},{"location":"cave/d2d-hydro/#hpc","text":"Contains 6-hour QPF (Quantitative Precipitation Forecast) data plus the submenus, described below, for Precipitation and Temps & Weather products. Precipitation Contains probabilities of daily precipitation, precipitation accumulation, and probabilities of daily snowfall. In addition, this submenu enables you to display QPF projections for 1 to 3 days in 6 hour increments, 4 to 5 days in 48 hour increments, and 1 to 5 days in 120 hour increments. The HPC Excessive Rainfall product consists of a contour graphic and image of the excessive rainfall for day 1 (with forecast times of 21, 24, 27, or 30 hours), and days 2 and 3 (both with forecast times of 48 and 72 hours). The HPC product will update the selected forecast cycle twice per day. Temps & Weather Contains daily Max/Min temperature anomalies, daily heat index probabilities, and pressure and frontal analysis.","title":"HPC"},{"location":"cave/d2d-hydro/#mpc","text":"Contains the Marine Guidance submenu, which includes marine analyses and model guidance. Note that the Marine Prediction Center (MPC) is now called the Ocean Prediction Center (OPC).","title":"MPC"},{"location":"cave/d2d-hydro/#cpc","text":"Contains threat charts and outlook grids derived from these two submenus: Threat Charts Contains drought monitoring data, daily threats assessment, and daily heat index forecasts. 
Outlook Grids Contains temperature and precipitation probabilities.","title":"CPC"},{"location":"cave/d2d-hydro/#awc","text":"Contains CCFP (Collaborative Convective Forecast Product), an aviation product. Formerly located under the Aviation option on the Upper Air menu, CCFP is a strategic forecast of convection to guide traffic managers in their system-wide approach to managing traffic. The forecast suite consists of 3 forecast maps with selectable lead times (4, 6, and 8 hours). The forecasts are issued by the Aviation Weather Center (AWC) between March 1 and October 30, eleven times per day. CCFP is alpha-numeric information suitable for the graphical depiction of forecast areas of significant thunderstorms. The CCFP message covers the CONUS area, and includes information on the location of thunderstorm areas, and associated information such as storm tops, coverage, confidence, and direction/speed of movement.","title":"AWC"},{"location":"cave/d2d-hydro/#nco","text":"Contains Precip & Stability, Temps & Weather, National Centers model, NGM MOS (NGM-based MOS system), and the following Sounding-derived plots submenus. Precip & Stability : Contains precipitation, radar, and stability products. Temps & Weather : Contains Max/Min temperature, freezing level, weather depiction, and surface geostrophic wind and relative vorticity plots. National Centers Models : Contains model guidance from the National Centers Sounding-derived plots : Contains options to display model soundings (sometimes called \"BUFR soundings\" because they are packaged in BUFR format for transmission). These are soundings extracted directly from the model, including all levels not generated from the pressure-level grids used elsewhere in the system. Sounding Availability This option displays the sounding locations (shown with asterisks) available from the latest model run; typically these locations coincide with TAF (Terminal Aerodrome Forecast) locations. The plot will update with each model run. Because the sounding data is quite voluminous, only soundings over your State(s) scale are saved. Surface The Surface Plots, which mimic the METAR Surface Plots, are taken from the model-derived soundings and provide hourly forecast surface plots. Because you cannot see all forecast projections in a 32 frame loop (e.g., displaying the entire North American Model (NAM) or Global Forecasting System (GFS) run would require 61 frames), you will probably want to use the Time Options Tool (refer to Subsection 2.2.6.4) to view a subset of the forecast -- perhaps a continuous run of hours or every other hour for the whole run. Ceiling/Visibility The \"Ceil/Vis Plot\" shows weather (rain, frz rain, snow) on the right, a stack of three cloud layers above, and visibility below the METAR station. The cloud layers are defined as low (990mb-640mb), mid (640mb-350mb), and high (<350mb). Each cloud layer shows a coverage circle with clear, sct, bkn, and ovc options. Next to one of the circles, there may be a cloud base. The cloud base is sent as a pressure, but is plotted in hft MSL based on a Standard Atmosphere conversion. Because the cloud layers and the cloud base are generated from separate algorithms at NCEP (National Centers for Environmental Prediction), it is possible to have broken or overcast clouds indicated but no base; alternatively, the base may be shown with a high overcast, while ignoring a mid broken layer. Also, a cloud base is reported if convective precipitation is indicated, even for only 10-20% cloud cover. 
As a result, one can see a cloud base associated with scattered clouds. 1 Hr and 3 Hr Precip Amt This option shows hourly amounts for NAM and 3 hour intervals for GFS at each location. Cloud Layers This option displays the amount of low, middle, and high cloud cover, each as a standard sky coverage symbol, and weather type as a weather symbol.","title":"NCO"},{"location":"cave/d2d-hydro/#hydro","text":"Contains QPE, QPF, and RFC Flash Flood Guidance submenus. Hydro Applications, such as HydroView and MPE Editor, are loaded from the Perspectives dialog (Hydro and MPE, respectively) or from the HydroApps menu in the Hydro(View) Perspective (Hydrobase, RiverPro, XDAT, Forecast Service, River Monitor, Precip Monitor, SSHP, and Dam Catalog). QPE : Makes available mosaic images of RFC-generated Quantitative Precipitation Estimator (QPE) and the Multisensor Precipitation Estimator (MPE) grids, which are displayed using a 'truncated' grid color table that shows zero values in gray to let you see the limits of the site-specified domain. These mosaic images are generated by the RFCs in 1, 6, and 24 hour cycles. The MPE grids can be displayed as local contours or images. NESDIS produces two types of Satellite Precipitation Estimates (SPE) based on GOES (Geostationary Operational Environmental Satellite) imagery series: Auto SPEs and Manual SPEs. Auto SPEs, which can be displayed directly from the QPE submenu, are produced hourly based on the most recent one-hour series of IR GOES imagery. This product is displayable on any AWIPS scale. The Auto SPE estimates are displayed in units of inches of precipitation that fell during the specified one hour period. Manual SPEs are accessible through the Manual SPE submenu. You can access the Manual SPE submenu from the QPE submenu. Generation of these products requires substantial manual intervention by NESDIS personnel; consequently, these products are generated and distributed to AWIPS at variable frequencies, as significant precipitation events warrant (i.e., their frequency is variable). The duration (or valid period) of the Manual SPEs is also variable. Whereas the duration of Auto SPEs is always one hour, the duration of the Manual SPEs ranges from 1 to 12 hours. Furthermore, although each Manual SPE product is mapped to a CONUS grid, the area of analysis is usually regional (focusing on the significant precipitation event). Apart from these important differences, the Manual SPEs are very similar to the Auto SPEs. QPF : Displays QPF, which indicate how much precipitation will occur in a particular grid. QPFs, which are issued by the RFCs, display as contours by default. However, from the pop-up menu you can convert them to image form. RFC Flash Flood Guidance : Displays County and Zone Flash Flood Guidance (FFG) grids on any scale. The area for which the data is displayed is limited, but the site system manager may configure a larger area. In addition, 1h, 3h, and 6h mosaic RFC-generated FFG grids can be displayed for both local and other RFC locations.","title":"Hydro"},{"location":"cave/d2d-hydro/#local-analysesstatistical-guidance","text":"Model Output Statistical (MOS) plots derived from the MOS BUFR and Text Bulletins display forecast data for GFS MOS, GFS-Extended MOS, Eta MOS, and NGM MOS. 
The plots are accessed by selecting NGM or GFS-LAMP/MOS forecasts under the Local Analyses/Statistical Guidance option.","title":"Local Analyses/Statistical Guidance"},{"location":"cave/d2d-map-resources/","text":"These map overlays are accessible through the Maps dropdown menu. Interstates Interstates and US Highways Warning Areas (with station identifier) WSR-88D Station Locations","title":"D2d map resources"},{"location":"cave/d2d-perspective/","text":"D2D Perspective \uf0c1 D2D (Display 2-Dimensions) is the default AWIPS CAVE perspective, designed to mimic the look and feel of the legacy AWIPS I system. System menus include CAVE , File , View , Options , and Tools . Data menus include Models , Surface , NCEP/Hydro , Upper Air , Satellite , Local Radar Stations , Radar , MRMS , and Maps . Map projection, image properties, frame control, and a few featured applications ( Warngen , Nsharp , and Browser ) make up the primary D2D toolbar. Note : Depending on the Operating System version of CAVE, there may be other application options ( PGEN , GEMPAK ). Resource Stack \uf0c1 At the bottom-right of the map window is the Resource Stack, which displays all loaded resources and map overlays, and allows for interaction and customization with the resource via a right-click menu . There are three available views of the Resource Stack; the default will show all Product Resources. The other two views are the Simple view, which shows the time, and the Map Resources. To switch between views, see the Right-Click Functionality . It's important to understand that Product Resources and Map Resources are handled differently given the time-based nature of Products, compared to the static nature of maps. Selecting the Clear button will remove all Products but not remove any Map Products. Left-Click Resource Name to Hide \uf0c1 A left click on any resource in the stack will hide the resource and turn the label gray. Clicking the name again makes the resource visible. Right-Click Background to Cycle Resource Views \uf0c1 The default display in the resource stack is the Product Resources. Right Click the mouse on the map background (anywhere but on the stack itself) to switch to a Simple View, which just shows the current displayed time if product data is loaded. Right Click again to show all Map Resources. Right Click again to switch back to Product Resources. Hold-Right-Click Resource Name for Menu \uf0c1 Drag the mouse over a loaded resource and hold the right mouse button until a menu appears. The hold-right-click menu allows you to control individual resource Image Properties , Change Colormaps , change resource color, width, density, and magnification, move resources up and down in the stack, as well as configure custom options with other interactive resources. This menu also gives you the option to unload this specific product , as opposed to removing all data products. Simply select the Unload option at the bottom of the resource's hold-right-click menu. Display Menu \uf0c1 The display menu has many options which can alter the functionality in CAVE. Hold-Right-Click Background for Display Menu \uf0c1 Holding down the right mouse button anywhere in the map view will open a right-click menu. Show Map Legends \uf0c1 From the above menu select Show Map Legends and watch the Resource Stack show only map resources which are loaded to the view. Sample Loaded Resources \uf0c1 Most data types have a right-click menu option for reading out the pixel value, displayed as multi-line text for multiple resources. 
This can be toggled on and off by selecting the Sample option in the Display Menu. Toggle 2 or 4-Panel Layout \uf0c1 Right-click hold in the view and select Two Panel Layout or Four Panel Layout to create duplicates of the current view. Notice the readout is at the same position in both panels. Any mouse movement made on one panel will be made on the other. By default, loading any data will load that data onto both panels. However, there is the option to specify which panel you would like to load data into, which can be useful if you want to have different data in each of the panels. To access this option, simply hold-right-click to pull up the Display menu and choose Load to This Panel as shown below: Now, a yellow L will appear in the lower left-hand corner of the panel you selected to load data to. When data is loaded from the menus it will only load to the display designated with the L. Switch back to loading in both panels by using the Load to All Panels option in the Display Menu. From this multi-pane display, hold-right-click again and you will see the Single Panel Layout option to switch back to a standard view (defaulting to the left of two, and top-left of four). Unload Data \uf0c1 Select Unload All Products to remove all loaded graphic and image products from the display and start fresh. Select Unload Graphics to remove all but the image products. Product Browser \uf0c1 The Product Browser allows users to browse a complete data inventory in a side window, organized by data type. To open the Product Browser, either select the icon in the toolbar ( ), or go to the menu: CAVE > Data Browsers > Product Browser . Selections for Grid , Lightning , Maps , Radar , Redbook , and Satellite are available. All products loaded with the Product Browser are given default settings. Note : The Linux and Mac versions also have a selection for GFE available. Options Menu \uf0c1 There are several toggle options and options dialogs that are available under the Options menu found at the top of the application. Time Options (Ctrl + T) \uf0c1 This check button enables/disables the ability to select the time interval between frames of real-time or model data. This feature has the added benefit of allowing you to view extended amounts of data (temporally) but stay within the limits of 64 frames. For example, METAR surface plots, which typically display every hour, can be set to display every three hours via the Select Valid Time and Time Resolution Dialog Box. When the Time Options check button is selected, the next product you choose to display in the Main Display Pane launches either the Select Valid Time and Time Resolution dialog box or the Select Offset and Tolerance dialog box. When you are loading data to an empty display and the Time Options check button is enabled, the Select Valid Time and Time Resolution dialog box opens. Valid Time: In this column of dates/times, you may choose the one that will be the first frame loaded onto the Large Display Pane. The Default option is the most recent data. Time Resolution: This column contains various time increments in which the data can be displayed. Once you make a selection, the Valid Time Column indents the exact times that will be displayed. The Default resolution displays the most recent frames available. 
With the Time Options check button enabled for a display that already contains data, when you choose the data to be overlaid in the Main Display Pane, the Select Offset and Tolerance dialog box appears, providing the following options: Offset : This column contains various time increments at intervals before, at, or after the time you selected for the first product that is displayed in the Main Display Pane. Tolerance : The options in this column refer to how strict the time matching is. \"None\" means an exact match, while \"Infinite\" will put the closest match in each frame, regardless of how far off it is. Image Combination (Insert) \uf0c1 This check button enables/disables the ability to display two images at once. Combined-image displays have been improved by removing the valid time for non-forecast products and removing the date string (time is kept) from the left side of the legend. In particular, this makes All-Tilts radar legends more usable. Display Properties \uf0c1 This menu option opens the Display Properties dialog box. Most of the options available in this dialog box are also available on the Toolbar , while the rest are available in the individual resource menus if that resource uses these properties. Loop Properties (Ctrl + L) \uf0c1 Loop Properties is another dialog box that can be opened from the Options menu or from the Loop Properties iconified button on the D2D Toolbar, or by using the Ctrl + L keyboard shortcut. The dialog allows you to adjust the forward and backward speeds, with 0 = off and 10 = maximum speed. You can set the duration of the first and last frame dwell times to between zero and 2.5 seconds. You can turn looping on or off by checking the Looping check button. There is also a Looping button located on the Toolbar that enables/disables the animation in the large display pane. Finally, you can turn looping on and increase/decrease forward speed by pressing Page Up/Page Down on your keyboard, and turn looping off with the Left or Right Arrow keys. On the toolbar, you can use the button to start/stop looping. Image Properties (Ctrl + I) \uf0c1 The Image Properties dialog box can be opened here (in the Options menu) or by using the Image Properties iconified button on the D2D Toolbar ( ), or using using the Ctrl + I keyboard shortcut. This dialog box provides options that allow you to change the color table; adjust the brightness, contrast, and alpha of either a single image or combined images; fade between combined images; and/or interpolate the displayed data. Set Time \uf0c1 This option allows you to set the CAVE clock, located on the bottom of the screen, to an earlier time for reviewing archived data. Set Background Color \uf0c1 You can now set the background display color on your workstation. You can also set the background display color for a single pane via mouse Button 3 (B3). Switching Perspectives \uf0c1 Switching perspectives in CAVE can be found in the CAVE > Perspective menu. D2D is one of many available CAVE perspectives. By selecting the CAVE > Perspective menu you can switch into the GFE , or Localization perspective. Note : The National Centers Perspective (which is available in the Other... submenu) is available on the Linux version of CAVE. And the GFE perspective is not available on the Windows version. CAVE Preferences \uf0c1 Preferences and settings for the CAVE client can be found in the CAVE > Preferences menu. 
Set the Localization Site and server for the workstation; configure mouse operations, change performance levels, font magnification, and text workstation hostname. Load Mode \uf0c1 Within the Display Properties dialog is the Load Mode option, which provides different ways to display data by manipulating previous model runs and inventories of data sets. The selected load mode is shown on the toolbar when the Load Mode menu is closed, and can also be changed by using this toolbar option. A description of the Load Mode options follows. Latest : Displays forecast data only from the latest model run, but also backfills at the beginning of the loop with available frames from previous runs to satisfy the requested number of frames. Valid time seq : Displays the most recent data and fills empty frames with previous data. For models, it provides the product from the latest possible run for every available valid time. No Backfill : Displays model data only from the most recent model run time with no backfilling to fill out a loop. Using this Load Mode prevents the mixing of old and new data. Previous run : Displays the previous model run, backfilling with frames from previous runs at the beginning of the loop to satisfy the requested number of frames. Prev valid time seq : Displays the previous model run and fills empty frames with previous model data or analyses. Prognosis loop : Shows a sequence of n-hour forecasts from successive model runs. Analysis loop : Loads a sequence of model analyses but no forecasts. dProg/dt : Selects forecasts from different model runs that all have the same valid times. This load mode is available only when there are no other products loaded in the large display pane. Forced : Puts the latest version of a selected product in all frames without time-matching. Forecast match : Overlays a model product only when its forecast times match those of an initially loaded product. This load mode is available only when another product is already loaded in the large display pane. Inventory : Selecting a product when the load mode is set to Inventory brings up a Dialog Box with the available forecast and inventory times from which you can select the product you want. Inventory loads into the currently displayed frame. Slot : Puts the latest version of a selected product in the currently displayed frame.","title":"D2D Perspective"},{"location":"cave/d2d-perspective/#d2d-perspective","text":"D2D (Display 2-Dimensions) is the default AWIPS CAVE perspective, designed to mimic the look and feel of the legacy AWIPS I system. System menus include CAVE , File , View , Options , and Tools . Data menus include Models , Surface , NCEP/Hydro , Upper Air , Satellite , Local Radar Stations , Radar , MRMS , and Maps . Map projection, image properties, frame control, and a few featured applications ( Warngen , Nsharp , and Browser ) make up the primary D2D toolbar. Note : Depending on which Operating System version of CAVE is being used, there may be other application options ( PGEN , GEMPAK ).","title":"D2D Perspective"},{"location":"cave/d2d-perspective/#resource-stack","text":"At the bottom-right of the map window is the Resource Stack, which displays all loaded resources and map overlays, and allows for interaction and customization with the resource via a right-click menu . There are three available views of the Resource Stack; the default shows all Product Resources. The other two views are the Simple view, which shows the time, and the Map Resources.
To switch between views see the Right-Click Functionality . It's important to understand that Product Resources and Map Resources are handled differently given the time-based nature of Products, compared to the static nature of maps. Selecting the Clear button will remove all Products but not remove any Map Products.","title":"Resource Stack"},{"location":"cave/d2d-perspective/#left-click-resource-name-to-hide","text":"A left click on any resource in the stack will hide the resource and turn the label gray. Clicking the name again makes the resource visible.","title":"Left-Click Resource Name to Hide"},{"location":"cave/d2d-perspective/#right-click-background-to-cycle-resource-views","text":"The default display in the resource stack is the Product Resources. Right Click the mouse on the map background (anywhere but on the stack itself) to switch to a Simple View, which just shows the current displayed time if product data is loaded. Right Click again to show all Map Resources. Right Click again to switch back to Product Resources.","title":"Right-Click Background to Cycle Resource Views"},{"location":"cave/d2d-perspective/#hold-right-click-resource-name-for-menu","text":"Drag the mouse over a loaded resource and hold the right mouse button until a menu appears. The hold-right-click menu allows you to control individual resource Image Properties , Change Colormaps , change resource color, width, density, and magnification, move resources up and down in the stack, as well as configure custom options with other interactive resources. This menu also gives you the option to unload this specific product , as opposed to removing all data products. Simply select the Unload option at the bottom of the resource's hold-right-click menu.","title":"Hold-Right-Click Resource Name for Menu"},{"location":"cave/d2d-perspective/#display-menu","text":"The display menu has many options which can alter the functionality in CAVE.","title":"Display Menu"},{"location":"cave/d2d-perspective/#hold-right-click-background-for-display-menu","text":"Holding down the right mouse button anywhere in the map view will open a right-click menu.","title":"Hold-Right-Click Background for Display Menu"},{"location":"cave/d2d-perspective/#show-map-legends","text":"From the above menu select Show Map Legends and watch the Resource Stack show only map resources which are loaded to the view.","title":"Show Map Legends"},{"location":"cave/d2d-perspective/#sample-loaded-resources","text":"Most data types have a right-click menu option for reading out the pixel value, displayed as multi-line text for multiple resources. This can be toggled on and off by selecting the Sample option in the Display Menu.","title":"Sample Loaded Resources"},{"location":"cave/d2d-perspective/#toggle-2-or-4-panel-layout","text":"Right-click hold in the view and select Two Panel Layout or Four Panel Layout to create duplicates of the current view. Notice the readout is at the same position in both panels. Any mouse movement made on one panel will be made on the other. By default, loading any data will load that data onto both panels. However, there is the option to specify which panel you would like to load data into, which can be useful if you want to have different data in each of the panels. To access this option, simply hold right-click to pull up the Display menu and choose Load to This Panel as shown below: Now, a yellow L will appear in the lower left hand corner of the panel you selected to load data to.
When data is loaded from the menus it will only load to the display designated with the L. Switch back to loading in both panels by using the Load to All Panels option in the Display Menu. From this multi-pane display, hold-right-click again and you will see the Single Panel Layout option to switch back to a standard view (defaulting to the left of two, and top-left of four).","title":"Toggle 2 or 4-Panel Layout"},{"location":"cave/d2d-perspective/#unload-data","text":"Select Unload All Products to remove all loaded graphic and image products from the display and start fresh. Select Unload Graphics to remove all but the image products.","title":"Unload Data"},{"location":"cave/d2d-perspective/#product-browser","text":"The Product Browser allows users to browse a complete data inventory in a side window, organized by data type. To open the Product Browser, either select the icon in the toolbar ( ), or go to the menu: CAVE > Data Browsers > Product Browser . Selections for Grid , Lightning , Maps , Radar , Redbook , and Satellite are available. All products loaded with the Product Browser are given default settings. Note : The Linux and Mac versions also have a selection for GFE available.","title":"Product Browser"},{"location":"cave/d2d-perspective/#options-menu","text":"There are several toggle options and options dialogs that are available under the Options menu found at the top of the application.","title":"Options Menu"},{"location":"cave/d2d-perspective/#time-options-ctrl-t","text":"This check button enables/disables the ability to select the time interval between frames of real-time or model data. This feature has the added benefit of allowing you to view extended amounts of data (temporally) but stay within the limits of 64 frames. For example, METAR surface plots, which typically display every hour, can be set to display every three hours via the Select Valid Time and Time Resolution Dialog Box. When the Time Options check button is selected, the next product you choose to display in the Main Display Pane launches either the Select Valid Time and Time Resolution dialog box or the Select Offset and Tolerance dialog box. When you are loading data to an empty display and the Time Options check button is enabled, the Select Valid Time and Time Resolution dialog box opens. Valid Time: In this column of dates/times, you may choose the one that will be the first frame loaded onto the Large Display Pane. The Default option is the most recent data. Time Resolution: This column contains various time increments in which the data can be displayed. Once you make a selection, the Valid Time Column indents the exact times that will be displayed. The Default resolution displays the most recent frames available. With the Time Options check button enabled for a display that already contains data, when you choose the data to be overlaid in the Main Display Pane, the Select Offset and Tolerance dialog box appears, providing the following options: Offset : This column contains various time increments at intervals before, at, or after the time you selected for the first product that is displayed in the Main Display Pane. Tolerance : The options in this column refer to how strict the time matching is. \"None\" means an exact match, while \"Infinite\" will put the closest match in each frame, regardless of how far off it is.","title":"Time Options (Ctrl + T)"},{"location":"cave/d2d-perspective/#image-combination-insert","text":"This check button enables/disables the ability to display two images at once.
Combined-image displays have been improved by removing the valid time for non-forecast products and removing the date string (time is kept) from the left side of the legend. In particular, this makes All-Tilts radar legends more usable.","title":"Image Combination (Insert)"},{"location":"cave/d2d-perspective/#display-properties","text":"This menu option opens the Display Properties dialog box. Most of the options available in this dialog box are also available on the Toolbar , while the rest are available in the individual resource menus if that resource uses these properties.","title":"Display Properties"},{"location":"cave/d2d-perspective/#loop-properties-ctrl-l","text":"Loop Properties is another dialog box that can be opened from the Options menu or from the Loop Properties iconified button on the D2D Toolbar, or by using the Ctrl + L keyboard shortcut. The dialog allows you to adjust the forward and backward speeds, with 0 = off and 10 = maximum speed. You can set the duration of the first and last frame dwell times to between zero and 2.5 seconds. You can turn looping on or off by checking the Looping check button. There is also a Looping button located on the Toolbar that enables/disables the animation in the large display pane. Finally, you can turn looping on and increase/decrease forward speed by pressing Page Up/Page Down on your keyboard, and turn looping off with the Left or Right Arrow keys. On the toolbar, you can use the button to start/stop looping.","title":"Loop Properties (Ctrl + L)"},{"location":"cave/d2d-perspective/#image-properties-ctrl-i","text":"The Image Properties dialog box can be opened here (in the Options menu) or by using the Image Properties iconified button on the D2D Toolbar ( ), or by using the Ctrl + I keyboard shortcut. This dialog box provides options that allow you to change the color table; adjust the brightness, contrast, and alpha of either a single image or combined images; fade between combined images; and/or interpolate the displayed data.","title":"Image Properties (Ctrl + I)"},{"location":"cave/d2d-perspective/#set-time","text":"This option allows you to set the CAVE clock, located on the bottom of the screen, to an earlier time for reviewing archived data.","title":"Set Time"},{"location":"cave/d2d-perspective/#set-background-color","text":"You can now set the background display color on your workstation. You can also set the background display color for a single pane via mouse Button 3 (B3).","title":"Set Background Color"},{"location":"cave/d2d-perspective/#switching-perspectives","text":"Switching perspectives in CAVE can be found in the CAVE > Perspective menu. D2D is one of many available CAVE perspectives. By selecting the CAVE > Perspective menu you can switch into the GFE or Localization perspective. Note : The National Centers Perspective (which is available in the Other... submenu) is available on the Linux version of CAVE, and the GFE perspective is not available on the Windows version.","title":"Switching Perspectives"},{"location":"cave/d2d-perspective/#cave-preferences","text":"Preferences and settings for the CAVE client can be found in the CAVE > Preferences menu.
Set the Localization Site and server for the workstation; configure mouse operations, change performance levels, font magnification, and text workstation hostname.","title":"CAVE Preferences"},{"location":"cave/d2d-perspective/#load-mode","text":"Within the Display Properties dialog is the Load Mode option, which provides different ways to display data by manipulating previous model runs and inventories of data sets. The selected load mode is shown on the toolbar when the Load Mode menu is closed, and can also be changed by using this toolbar option. A description of the Load Mode options follows. Latest : Displays forecast data only from the latest model run, but also backfills at the beginning of the loop with available frames from previous runs to satisfy the requested number of frames. Valid time seq : Displays the most recent data and fills empty frames with previous data. For models, it provides the product from the latest possible run for every available valid time. No Backfill : Displays model data only from the most recent model run time with no backfilling to fill out a loop. Using this Load Mode prevents the mixing of old and new data. Previous run : Displays the previous model run, backfilling with frames from previous runs at the beginning of the loop to satisfy the requested number of frames. Prev valid time seq : Displays the previous model run and fills empty frames with previous model data or analyses. Prognosis loop : Shows a sequence of n-hour forecasts from successive model runs. Analysis loop : Loads a sequence of model analyses but no forecasts. dProg/dt : Selects forecasts from different model runs that all have the same valid times. This load mode is available only when there are no other products loaded in the large display pane. Forced : Puts the latest version of a selected product in all frames without time-matching. Forecast match : Overlays a model product only when its forecast times match those of an initially loaded product. This load mode is available only when another product is already loaded in the large display pane. Inventory : Selecting a product when the load mode is set to Inventory brings up a Dialog Box with the available forecast and inventory times from which you can select the product you want. Inventory loads into the currently displayed frame. Slot : Puts the latest version of a selected product in the currently displayed frame.","title":"Load Mode"},{"location":"cave/d2d-pointdata-surface-obs/","text":"Several of the data sets in the Obs menu can be interrogated (sampled) for more detailed information by clicking mouse Button 1 (B1) over a site. These data sets include METAR, Maritime, and Local. The Obs menu is subdivided into sections that contain related products. These sections are described below. METAR \uf0c1 This section contains automatically updating METAR observations, ceiling and visibility plots, wind chill and heat indices, precipitation plots at various time intervals, and quality-checked MSAS observations. The 24hr Chg METAR plot provides the difference between the observed temperature, dewpoint, pressure, and wind from those observed 24 hours earlier. The calculation of the wind difference involves vector subtraction of the \"u\" and \"v\" components. Synoptic \uf0c1 This section contains automatically updating Synoptic observations, and 6 hour and 24 hour precip plots. Note that this section of the menu is not present at most sites.
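As a hedged illustration of the u/v vector subtraction mentioned for the 24hr Chg METAR plot above (hypothetical observations and a common meteorological sign convention are assumed; this is not the decoder's actual code):

import math

def wind_to_uv(speed_kts, dir_from_deg):
    # Decompose a wind given as speed and the direction it blows FROM
    # into u (eastward) and v (northward) components.
    rad = math.radians(dir_from_deg)
    return -speed_kts * math.sin(rad), -speed_kts * math.cos(rad)

# 20 kt from the south now versus 10 kt from the west 24 hours ago.
u_now, v_now = wind_to_uv(20, 180)
u_old, v_old = wind_to_uv(10, 270)
du, dv = u_now - u_old, v_now - v_old
print(round(math.hypot(du, dv), 1))  # magnitude of the 24-hour wind change vector, about 22.4 kt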
Maritime \uf0c1 This section contains buoy and ship report plots, plus SAFESEAS for the Marine WFOs. MAROB displays include Station Plots. The Other Maritime Plots cascading menu contains options to display the Fixed and Moving Sea State plots, MAROB Sea State and Cloud/Vis plots, Maritime Clouds/Visibility plots, as well as the Scatterometer Winds. Sea State plots provide information on the wave period and height and swell period and height. The wave type, whether a standard wave or a wind wave, is denoted at the origin of the plot by a \"+\" or a \"w\", respectively. An \"x\" at the plot origin signifies that no wave type was reported. If reported, the directions of the primary and secondary swells are denoted with arrows labeled \"1\" and \"2\", respectively. The arrows point in the direction the swell is moving. Maritime Clouds/Visibility plots contain a station circle denoting sky coverage and the visibility along with standard symbols for obstructions to visibility. Scatterometer Winds are obtained from the ASCAT instrument on EUMETSAT's MetOp-A polar orbiting satellite. This instrument sends pulses of radiation to the ocean surface and measures the amount of energy, called backscatter, it receives back. When you sample these observations, the time, satellite ID, wind direction, and wind speed are provided. With the polar orbiting scanning, a given region will generally be sampled about every 12 hours. ASCAT Winds (25 km retrieval resolution but interpolated and displayed at 12.5 km resolution) can be launched from either the CAVE Obs menu or from the Satellite menu. You can access the Scatterometer Winds menu options by selecting Surface > Other Maritime Plots > Scatterometer Winds . The ASCAT Scatterometer Ocean Winds product is displayable on CAVE at all scales: N. Hemisphere, North America, CONUS, Regional, State(s), and WFO. Local Storm Reports : Local Storm Report (LSR) plots are generated from spotter reports that were entered into the LSR text database and decoded into the correct point data format. The LSR graphical user interface (GUI) is a stand-alone AWIPS application designed to provide forecasters with an easy and quick way to create, manage, and send the LSR public text product. This text product contains noteworthy weather events for which the forecaster has either received or sought out real-time observations. National Convective Weather Forecast (AWC) \uf0c1 The National Convective Weather Forecast (NCWF) is an automatically generated depiction of current convection and extrapolated significant current convection. It is a supplement to, but does not substitute for, the report and forecast information contained in Convective SIGMETs . The NCWF contains both GRIB and BUFR output. The GRIB output delineates the current convection. The BUFR output includes hazardous convection area polygons, movement arrows, and storm top and speed text information. The NCWF display bundle renders storm tops and movement , previous performance polygons , 1-hour extrapolation polygons , and current convective interest grid (colorbar). Center Weather Advisories (CWA) \uf0c1 The CWA is an aviation weather warning for conditions meeting or approaching national in-flight advisory (AIRMET, SIGMET or SIGMET for convection) criteria. The CWA is primarily used by air crews to anticipate and avoid adverse weather conditions in the en route and terminal environments. It is not a flight planning product because of its short lead time and duration.
Shown with NEXRAD DHR composite: MOS Products \uf0c1 These plots are derived from the MOS BUFR Bulletins. The previous MOS plots were derived from the MOS Text Bulletins. The plots display forecast data for GFS MOS, GFS-Extended MOS, and NGM MOS. Submenus under each model reveal the element choices. These displays include: Station Model Plots (Wind, T, Td, Sky Cover, Wx) MaxT/MinT (\u00b0F) Ceiling (agl) / Visibility (ft \u00d7 100) (Categorical) Probabilities Submenu (6h/12h PoP, 6h/12h Tstorm, 6h/12h Svr-Tstorm, Conditional precipitation types; %) QPF 12h (Categorical mid-points; inches) QPF 6h (Categorical mid-points; inches) Snowfall (6h/12h/24h, Categorical; inches) Lightning \uf0c1 This menu item provides three options for displaying lightning flash plots over specified 1 minute, 5 minute, 15 minute and 1 hour intervals. USPLN (United States Precision Lightning Network) : WSI Corporation USPLN lightning data has been made available exclusively to universities for education and research use. Unidata serves USPLN lightning stroke data from the LIGHTNING LDM data feed. Registration is required to request this data, and the free feed is available on an annually renewed basis. USPLN data is not available to the public. NLDN (National Lightning Detection Network) : The NLDN option plots cloud-to-ground (CG) lightning flashes for specified time intervals across the continental United States. NLDN lightning data can be displayed as a grid image displaying the cloud-to-ground density values for a selected resolution (1km, 3km, 5km, 8km, 20km, and 40km). GLD (Global Lightning Dataset) : The GLD option plots cloud-to-ground (CG) lightning flashes for specified time intervals on a global-scale. GLD lightning data can also be displayed as a grid image displaying the cloud-to-ground density values for a selected resolution (1km, 3km, 5km, 8km, 20km, and 40km). ENI Total Lightning : In addition to displaying CG lightning flashes, the Total Lightning option also displays Cloud Flash (CF) lightning and Pulses. CF lightning are lightning flashes which do not strike the ground such as in-cloud, cloud-to-cloud, and cloud-to-air lightning. Lightning pulses are electromagnetic pulses that radiate outward from the lightning channel. ENI total lightning data can be displayed as a grid image displaying the cloud-to-ground, cloud flash, and lightning pulse density values for a selected resolution (1km, 3km, 5km, 8km, 20km, and 40km).","title":"D2d pointdata surface obs"},{"location":"cave/d2d-pointdata-surface-obs/#metar","text":"This section contains automatically updating METAR observations, ceiling and visibility plots, wind chill and heat indices, precipitation plots at various time intervals, and quality-checked MSAS observations. The 24hr Chg METAR plot provides the difference between the observed temperature, dewpoint, pressure, and wind from those observed 24 hours earlier. The calculation of the wind difference involves vector subtraction of the \"u\" and \"v\" components.","title":"METAR"},{"location":"cave/d2d-pointdata-surface-obs/#synoptic","text":"This section contains automatically updating Synoptic observations, and 6 hour and 24 hour precip plots. Note that this section of the menu is not present at most sites.","title":"Synoptic"},{"location":"cave/d2d-pointdata-surface-obs/#maritime","text":"This section contains buoy and ship report plots, plus SAFESEAS for the Marine WFOs. 
MAROB displays include Station Plots. The Other Maritime Plots cascading menu contains options to display the Fixed and Moving Sea State plots, MAROB Sea State and Cloud/Vis plots, Maritime Clouds/Visibility plots, as well as the Scatterometer Winds. Sea State plots provide information on the wave period and height and swell period and height. The wave type, whether a standard wave or a wind wave, is denoted at the origin of the plot by a \"+\" or a \"w\", respectively. An \"x\" at the plot origin signifies that no wave type was reported. If reported, the directions of the primary and secondary swells are denoted with arrows labeled \"1\" and \"2\", respectively. The arrows point in the direction the swell is moving. Maritime Clouds/Visibility plots contain a station circle denoting sky coverage and the visibility along with standard symbols for obstructions to visibility. Scatterometer Winds are obtained from the ASCAT instrument on EUMETSAT's MetOp-A polar orbiting satellite. This instrument sends pulses of radiation to the ocean surface and measures the amount of energy, called backscatter, it receives back. When you sample these observations, the time, satellite ID, wind direction, and wind speed are provided. With the polar orbiting scanning, a given region will generally be sampled about every 12 hours. ASCAT Winds (25 km retrieval resolution but interpolated and displayed at 12.5 km resolution) can be launched from either the CAVE Obs menu or from the Satellite menu. You can access the Scatterometer Winds menu options by selecting Surface > Other Maritime Plots > Scatterometer Winds . The ASCAT Scatterometer Ocean Winds product is displayable on CAVE at all scales: N. Hemisphere, North America, CONUS, Regional, State(s), and WFO. Local Storm Reports : Local Storm Report (LSR) plots are generated from spotter reports that were entered into the LSR text database and decoded into the correct point data format. The LSR graphical user interface (GUI) is a stand-alone AWIPS application designed to provide forecasters with an easy and quick way to create, manage, and send the LSR public text product. This text product contains noteworthy weather events for which the forecaster has either received or sought out real-time observations.","title":"Maritime"},{"location":"cave/d2d-pointdata-surface-obs/#national-convective-weather-forecast-awc","text":"The National Convective Weather Forecast (NCWF) is an automatically generated depiction of current convection and extrapolated significant current convection. It is a supplement to, but does not substitute for, the report and forecast information contained in Convective SIGMETs . The NCWF contains both GRIB and BUFR output. The GRIB output delineates the current convection. The BUFR output includes hazardous convection area polygons, movement arrows, and storm top and speed text information. The NCWF display bundle renders storm tops and movement , previous performance polygons , 1-hour extrapolation polygons , and current convective interest grid (colorbar).","title":"National Convective Weather Forecast (AWC)"},{"location":"cave/d2d-pointdata-surface-obs/#center-weather-advisories-cwa","text":"The CWA is an aviation weather warning for conditions meeting or approaching national in-flight advisory (AIRMET, SIGMET or SIGMET for convection) criteria. The CWA is primarily used by air crews to anticipate and avoid adverse weather conditions in the en route and terminal environments.
It is not a flight planning product because of its short lead time and duration. Shown with NEXRAD DHR composite:","title":"Center Weather Advisories (CWA)"},{"location":"cave/d2d-pointdata-surface-obs/#mos-products","text":"These plots are derived from the MOS BUFR Bulletins. The previous MOS plots were derived from the MOS Text Bulletins. The plots display forecast data for GFS MOS, GFS-Extended MOS, and NGM MOS. Submenus under each model reveal the element choices. These displays include: Station Model Plots (Wind, T, Td, Sky Cover, Wx) MaxT/MinT (\u00b0F) Ceiling (agl) / Visibility (ft \u00d7 100) (Categorical) Probabilities Submenu (6h/12h PoP, 6h/12h Tstorm, 6h/12h Svr-Tstorm, Conditional precipitation types; %) QPF 12h (Categorical mid-points; inches) QPF 6h (Categorical mid-points; inches) Snowfall (6h/12h/24h, Categorical; inches)","title":"MOS Products"},{"location":"cave/d2d-pointdata-surface-obs/#lightning","text":"This menu item provides three options for displaying lightning flash plots over specified 1 minute, 5 minute, 15 minute and 1 hour intervals. USPLN (United States Precision Lightning Network) : WSI Corporation USPLN lightning data has been made available exclusively to universities for education and research use. Unidata serves USPLN lightning stroke data from the LIGHTNING LDM data feed. Registration is required to request this data, and the free feed is available on an annually renewed basis. USPLN data is not available to the public. NLDN (National Lightning Detection Network) : The NLDN option plots cloud-to-ground (CG) lightning flashes for specified time intervals across the continental United States. NLDN lightning data can be displayed as a grid image displaying the cloud-to-ground density values for a selected resolution (1km, 3km, 5km, 8km, 20km, and 40km). GLD (Global Lightning Dataset) : The GLD option plots cloud-to-ground (CG) lightning flashes for specified time intervals on a global-scale. GLD lightning data can also be displayed as a grid image displaying the cloud-to-ground density values for a selected resolution (1km, 3km, 5km, 8km, 20km, and 40km). ENI Total Lightning : In addition to displaying CG lightning flashes, the Total Lightning option also displays Cloud Flash (CF) lightning and Pulses. CF lightning are lightning flashes which do not strike the ground such as in-cloud, cloud-to-cloud, and cloud-to-air lightning. Lightning pulses are electromagnetic pulses that radiate outward from the lightning channel. ENI total lightning data can be displayed as a grid image displaying the cloud-to-ground, cloud flash, and lightning pulse density values for a selected resolution (1km, 3km, 5km, 8km, 20km, and 40km).","title":"Lightning"},{"location":"cave/d2d-radar-tools/","text":"Radar Tools \uf0c1 The radar tools are a subset of the tools available in CAVE. These programs are accessible through the Tools dropdown menu, and in individual site radar menus. Estimated Actual Velocity (EAV) \uf0c1 A velocity (V) display from the radar shows only the radial component of the wind, so the indicated speed depends on the direction of the wind and the azimuth (direction) from the radar. Consider, for example, a north wind. Straight north of the radar, the full speed of the wind will be seen on the V product. As one moves around to the east of the radar, the radial component gets smaller, eventually reaching zero straight east of the radar.
If the wind direction is known, then the actual wind speed can be computed by dividing the observed radial speed by the cosine of the angle between the radar radial and the actual direction. The EAV tool allows you to provide that angle and use the sampling function of the display to show the actual wind speed. Radar Display Controls \uf0c1 The Radar Display Controls dialog box is derived from the Radar Tools submenu and provides options that control the appearance of the Storm Track Information (STI), the Hail Index (HI), the Tornado Vortex Signature (TVS), the Digital Mesocyclone Display (DMD) products, the Microburst Alert (MBA) products, the Storm Relative Motion (SRM), and the SAILS products. The Radar Display Controls dialog box options are described below. Note : Our version of CAVE may not have all the products that these options are applicable to. The Radar Display Controls dialog box is divided into eight sections: STI , HI , TVS , DMD/MD/TVS , DMD , MBA , SRM , and SAILS . Each section has the following options: STI (Storm Track Information) \uf0c1 This section has options to adjust the appearance of the STI graphic product. Number of storms to show : This slider bar lets you choose the maximum number of storms (0 to 100) you wish to display on the STI product. The default value is 20 storms. Type of track to show : This options menu allows you to choose the type of storm track that you want displayed. HI (Hail Index) \uf0c1 This portion of the Radar Display Controls dialog box contains options that alter the appearance of the HI radar graphic product. You can set the low and high algorithm thresholds of the Probability of Hail (POH) and the Probability of Severe Hail (POSH). Storms that meet the low POH threshold are indicated by small open triangles, while small solid triangles mark those that meet the high POH threshold. Similarly, large open triangles or solid triangles are plotted for the POSH low and high thresholds, respectively. Low hail probability (POH) : The storms that meet or exceed the threshold are indicated by small open triangles. The default setting is 30. Low severe hail probability (POSH) : The storms that meet or exceed the threshold are indicated by large open triangles. The default setting is 30. High hail probability : The storms that meet or exceed the threshold are indicated by small solid triangles. The default setting is 50. High severe hail probability : The storms that meet or exceed the threshold are indicated by large solid triangles. The default setting is 50. TVS (Tornado Vortex Signature) \uf0c1 There is one option in this section of the Radar Display Controls dialog box. Show elevated TVS : This toggle button lets you control the appearance of the elevated TVS radar graphic product. DMD, MD, TVS \uf0c1 There is one option in this section of the Radar Display Controls dialog box. Show extrapolated features : With this option, you can choose whether to show the time-extrapolated features using DMD, MD, or TVS. DMD (Digital Mesocyclone Display) \uf0c1 Minimum feature strength : A mesocyclone clutter filter which specifies the minimum 3D strength rank used to display a mesocyclone (default is 5). Show overlapping Mesos : Toggles whether to show overlapping mesocyclones. Type of track to show : This dropdown has options available for whether to display past and/or forecast tracks. MBA (Microburst Alert) \uf0c1 Show Wind Shear : This option allows you to choose whether to display wind shear associated with microburst alerts.
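The SRM options described in the next section control the storm motion vector that is removed from the base velocity; as a rough, hedged sketch of that subtraction (hypothetical values and a simplified sign convention, not the actual AWIPS SRM algorithm):

import math

def storm_relative_radial_velocity(v_radial_kts, radial_azimuth_deg, storm_toward_deg, storm_speed_kts):
    # Remove the component of the storm motion that lies along this radar radial
    # (outbound positive; storm direction given as the azimuth it moves toward).
    along_radial = storm_speed_kts * math.cos(math.radians(radial_azimuth_deg - storm_toward_deg))
    return v_radial_kts - along_radial

# A 40 kt outbound velocity on a radial aligned with a 25 kt storm motion
# leaves roughly 15 kt of storm-relative motion.
print(storm_relative_radial_velocity(40.0, radial_azimuth_deg=90.0, storm_toward_deg=90.0, storm_speed_kts=25.0))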
SRM (Storm Relative Motion) \uf0c1 The first three options in the SRM section allow you to choose where you want to derive the storm motion from. Storm Motion from WarnGen Track : Selecting this option will display the storm motion from a WarnGen Track. Average Storm Motion from STI : Selecting this option will display the average storm motion from the storm track information (STI). Custom Storm Motion : Selecting this option allows you to specify a custom storm motion with the selections below. Direction : This slider allows you to choose the direction (in degrees??) of the storm motion. Speed : This slider allows you to specify the speed (in mph??) of the storm motion. SAILS (Supplemental Adaptive Intra-Volume Low Level Scan) \uf0c1 Enable SAILS Frame Coordinator : Enabled (default) : keyboard shortcuts change where tilting up from 0.5 degree SAILS tilt will step to the next higher tilt (similar to GR2 Analyst) and Ctrl right arrow will step to the most recent tilt available for any elevation angle. Disabled : keyboard shortcuts change where tilting up from 0.5 degree SAILS tilt will not go anywhere (old confusing behavior) and Ctrl right arrow will step to the most recent time of the current tilt. VR - Shear \uf0c1 This tool is used in conjunction with Doppler velocity data to calculate the velocity difference (or \"shear\") of the data directly under the end points. As with the Baselines, this feature comes up editable and the end points can be dragged to specific gates of velocity data. When in place, the speed difference (kts), distance between end points (nautical miles), shear (s-1), and distance from radar (Nmi) are automatically plotted next to the end points and in the upper left corner of the Main Display Pane. A positive shear value indicates cyclonic shear, while a negative value indicates anticyclonic shear. If either end point is not directly over velocity data, the phrase \"no data\" is reported for the shear value. This tool is also useful in determining gate-to-gate shear. Simply place the two end points directly over adjacent gates of velocity data. \"Snapping\" VR Shear : If you are zoomed in over an area when you load VR - Shear, and the VR - Shear Baseline does not appear, click the right mouse button to \"snap\" the Baseline to where the mouse cursor is located. VR - Shear in 4 Panel : You can use the VR - Shear Tool when the large display is in 4 panel mode. The VR - Shear overlay is loaded in different colors for each panel. There are actually four copies of the program running, and each behaves independently. This means that you can get accurate readings in any one of the four panels \u2014 one VR - Shear panel is editable at a time. To activate, click the center mouse button on the VR - Shear legend in the desired panel and position the query line to the echoes of interest.","title":"Radar Tools"},{"location":"cave/d2d-radar-tools/#radar-tools","text":"The radar tools are a subset of the tools available in CAVE. These programs are accessible through the Tools dropdown menu, and in individual site radar menus.","title":"Radar Tools"},{"location":"cave/d2d-radar-tools/#estimated-actual-velocity-eav","text":"A velocity (V) display from the radar shows only the radial component of the wind, so the indicated speed depends on the direction of the wind and the azimuth (direction) from the radar. Consider, for example, a north wind. Straight north of the radar, the full speed of the wind will be seen on the V product.
As one moves around to the east of the radar, the radial component gets smaller, eventually reaching zero straight east of the radar. If the wind direction is known, then the actual wind speed can be computed by dividing the observed radial speed by the cosine of the angle between the radar radial and the actual direction. The EAV tool allows you to provide that angle and use the sampling function of the display to show the actual wind speed.","title":"Estimated Actual Velocity (EAV)"},{"location":"cave/d2d-radar-tools/#radar-display-controls","text":"The Radar Display Controls dialog box is derived from the Radar Tools submenu and provides options that control the appearance of the Storm Track Information (STI), the Hail Index (HI), the Tornado Vortex Signature (TVS), the Digital Mesocyclone Display (DMD) products, the Microburst Alert (MBA) products, the Storm Relative Motion (SRM), and the SAILS products. The Radar Display Controls dialog box options are described below. Note : Our version of CAVE may not have all the products that these options are applicable to. The Radar Display Controls dialog box is divided into eight sections: STI , HI , TVS , DMD/MD/TVS , DMD , MBA , SRM , and SAILS . Each section has the following options:","title":"Radar Display Controls"},{"location":"cave/d2d-radar-tools/#sti-storm-track-information","text":"This section has options to adjust the appearance of the STI graphic product. Number of storms to show : This slider bar lets you choose the maximum number of storms (0 to 100) you wish to display on the STI product. The default value is 20 storms. Type of track to show : This options menu allows you to choose the type of storm track that you want displayed.","title":"STI (Storm Track Information)"},{"location":"cave/d2d-radar-tools/#hi-hail-index","text":"This portion of the Radar Display Controls dialog box contains options that alter the appearance of the HI radar graphic product. You can set the low and high algorithm thresholds of the Probability of Hail (POH) and the Probability of Severe Hail (POSH). Storms that meet the low POH threshold are indicated by small open triangles, while small solid triangles mark those that meet the high POH threshold. Similarly, large open triangles or solid triangles are plotted for the POSH low and high thresholds, respectively. Low hail probability (POH) : The storms that meet or exceed the threshold are indicated by small open triangles. The default setting is 30. Low severe hail probability (POSH) : The storms that meet or exceed the threshold are indicated by large open triangles. The default setting is 30. High hail probability : The storms that meet or exceed the threshold are indicated by small solid triangles. The default setting is 50. High severe hail probability : The storms that meet or exceed the threshold are indicated by small solid triangles. The default setting is 50.","title":"HI (Hail Index)"},{"location":"cave/d2d-radar-tools/#tvs-tornado-vortex-signature","text":"There is one option in this section of the Radar Display Controls dialog box. Show elevated TVS : This toggle button lets you control the appearance of the elevated TVS radar graphic product.","title":"TVS (Tornado Vortex Signature)"},{"location":"cave/d2d-radar-tools/#dmd-md-tvs","text":"There is one option in this section of the Radar Display Controls dialog box. 
Show extrapolated features : With this option, you can choose whether to show the time-extrapolated features using DMD, MD, or TVS.","title":"DMD, MD, TVS"},{"location":"cave/d2d-radar-tools/#dmd-digital-mesocyclone-display","text":"Minimum feature strength : A mesocyclone clutter filter which specifies the minimum 3D strength rank used to display a mesocyclone (default is 5). Show overlapping Mesos : Toggles whether to show overlapping mesocyclones. Type of track to show : This dropdown has options available for whether to display past and/or forecast tracks.","title":"DMD (Digital Mesocyclone Display)"},{"location":"cave/d2d-radar-tools/#mba-microburst-alert","text":"Show Wind Shear : This option allows you to choose whether to display wind shear associated with microburst alerts.","title":"MBA (Microburst Alert)"},{"location":"cave/d2d-radar-tools/#srm-storm-relative-motion","text":"The first three options in the SRM section allow you to choose where you want to derive the storm motion from. Storm Motion from WarnGen Track : Selecting this option will display the storm motion from a WarnGen Track. Average Storm Motion from STI : Selecting this option will display the average storm motion from the storm track information (STI). Custom Storm Motion : Selecting this option allows you to specify a custom storm motion with the selections below. Direction : This slider allows you to choose the direction (in degrees??) of the storm motion. Speed : This slider allows you to specify the speed (in mph??) of the storm motion.","title":"SRM (Storm Relative Motion)"},{"location":"cave/d2d-radar-tools/#sails-supplemental-adaptive-intra-volume-low-level-scan","text":"Enable SAILS Frame Coordinator : Enabled (default) : keyboard shortcuts change where tilting up from 0.5 degree SAILS tilt will step to the next higher tilt (similar to GR2 Analyst) and Ctrl right arrow will step to the most recent tilt available for any elevation angle. Disabled : keyboard shortcuts change where tilting up from 0.5 degree SAILS tilt will not go anywhere (old confusing behavior) and Ctrl right arrow will step to the most recent time of the current tilt.","title":"SAILS (Supplemental Adaptive Intra-Volume Low Level Scan)"},{"location":"cave/d2d-radar-tools/#vr-shear","text":"This tool is used in conjunction with Doppler velocity data to calculate the velocity difference (or \"shear\") of the data directly under the end points. As with the Baselines, this feature comes up editable and the end points can be dragged to specific gates of velocity data. When in place, the speed difference (kts), distance between end points (nautical miles), shear (s-1), and distance from radar (Nmi) are automatically plotted next to the end points and in the upper left corner of the Main Display Pane. A positive shear value indicates cyclonic shear, while a negative value indicates anticyclonic shear. If either end point is not directly over velocity data, the phrase \"no data\" is reported for the shear value. This tool is also useful in determining gate-to-gate shear. Simply place the two end points directly over adjacent gates of velocity data. \"Snapping\" VR Shear : If you are zoomed in over an area when you load VR - Shear, and the VR - Shear Baseline does not appear, click the right mouse button to \"snap\" the Baseline to where the mouse cursor is located. VR - Shear in 4 Panel : You can use the VR - Shear Tool when the large display is in 4 panel mode. The VR - Shear overlay is loaded in different colors for each panel.
There are actually four copies of the program running, and each behaves independently. This means that you can get accurate readings in any one of the four panels \u2014 one VR - Shear panel is editable at a time. To activate, click the center mouse button on the VR - Shear legend in the desired panel and position the query line to the echoes of interest.","title":"VR - Shear"},{"location":"cave/d2d-radar/","text":"NEXRAD Radar Display \uf0c1 The Unidata D2D Perspective features a selectable NEXRAD station display over a loop of the FNEXRAD Digital Hybrid Reflectivity product. Selecting any station will open a two-panel reflectivity and velocity view for the selected station. NEXRAD & TDWR Station Menus \uf0c1 Individual NEXRAD station menus are accessible in Radar > NEXRAD Stations and are grouped alphabetically for a condensed submenu structure. With only the NEXRAD3 feedtype (NEXRAD2 being disabled), notice that only some of the menu items will fill out with available data. Best Res Z+SRM8 / Z+V \uf0c1 The radar combination products Z+SRM and Z+V are precombined formats of the reflectivity and storm relative motion or velocity, displayed together via a single menu selection.
Once an arrow key (Left, Right, Up, Down) has been pressed, the stepping/animation controls on the main window toolbar and the Page Up/Page Down keys will function in that same mode. For example, assume the UP ARROW or DOWN ARROW key is pressed; the menu controls will now operate through the tilts at a fixed time, e.g., you can go to the lowest tilt by selecting the First Frame iconified button. Best Res Base Products \uf0c1 This section is divided into two parts. The upper part lists individual products: four base products and three dual-pol products (ZDR, CC, and KDP). The lower part includes submenus for accessing multiple products and applications. The following describes the submenus grouped in the lower part of the Best Res Base Products section. Precip : In addition to the QPE dual-pol products, this submenu includes the legacy precip products, which include Storm Total, One Hour, Three Hour, and User Selectable precipitation products. A suite of snowfall products is also available on the Precip submenu. All are available for request (OTR, RMR), and the first four can be added to an RPS (Routine Product Set) list. All of these products are available on any scale. Derived Products : The Derived Products submenu includes Layer Reflectivity, Cross Section, and Other products displayed on any scale. Derived products include precipitation, storm (mesocyclone, hail, tornado), and wind derivations. Algorithm Overlays : The Algorithm Overlays submenu includes legacy algorithm overlays and the ML dual-pol overlay. four panel : The four panel submenu includes menu entries for Z+V, Z+SRM 8- and 4-bit, and some other combinations that are presented in 4-panel mode, with a different elevation angle or product in each panel. Data Quality : The Data Quality products, accessible by a pull-right submenu, include Clutter Filter Control and reflectivity and velocity clutter probability products. 4-bit/Legacy Prods : The 4-bit/Legacy Prods submenu uses generic selectors that load 8-bit (256 level) data, with legacy 4-bit (16 level) and 3-bit (8 level) data filling in when no 8-bit data is available. Radar Applications : The Radar Applications submenu provides access to all the radar applications and radar tools. MRMS \uf0c1 FNEXRAD Composites \uf0c1 DHR \uf0c1 DLV \uf0c1 EET \uf0c1 HHC \uf0c1 DAA \uf0c1 DTA \uf0c1 Mosaic Radar Plots \uf0c1 Mosaics available via this menu use data from up to nine nearby radars. Additional optional mosaics on cascading menus provide a limited list of radar products from a predefined set of WSR-88D radars within a given region. Your System Manager or site Administrator can set up such mosaics by: /awips2/edex/data/utility/common_static/site//radar/radarInUse.txt . A mosaicInfo.txt table will only work while logged on to an AWIPS workstation. N0Q \uf0c1 DSP \uf0c1 DTA \uf0c1 DAA \uf0c1 Radar Applications \uf0c1 Estimated Actual Velocity (EAV) \uf0c1 A velocity (V) display from the radar shows only the radial component of the wind, so the indicated speed depends on the direction of the wind and the azimuth (direction) from the radar. Consider, for example, a north wind. Straight north of the radar, the full speed of the wind will be seen on the V product. As one moves around to the east of the radar, the radial component gets smaller, eventually reaching zero straight east of the radar. If the wind direction is known, then the actual wind speed can be computed by dividing the observed radial speed by the cosine of the angle between the radar radial and the actual direction. 
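As a hedged, illustrative sketch of that relationship (made-up numbers and a simplified angle convention; this is not the EAV tool's code):

import math

def estimated_actual_speed(radial_speed, angle_deg):
    # Divide the observed radial speed by the cosine of the angle between
    # the radar radial and the actual wind direction (undefined near 90 degrees).
    return radial_speed / math.cos(math.radians(angle_deg))

# A 26 kt radial reading on a radial 30 degrees off the true wind direction
# implies an actual wind of roughly 30 kt (26 / cos(30)).
print(round(estimated_actual_speed(26.0, 30.0), 1))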
The EAV tool allows you to provide that angle and use the sampling function of the display to show the actual wind speed. Four-dimensional Stormcell Investigator (FSI) \uf0c1 The Four-dimensional Stormcell Investigator (FSI) was developed by the National Severe Storms Laboratory for its Warning Decision Support System Integrated Information. This technology allows users to create and manipulate dynamic cross-sections (both vertical and at constant altitude), such that one can \u201cslice and dice\u201d storms and view these data in three dimensions and across time. V-R Shear \uf0c1 This tool is used in conjunction with Doppler velocity data to calculate the velocity difference (or \"shear\") of the data directly under the end points. As with the Baselines, this feature comes up editable and the end points can be dragged to specific gates of velocity data. When in place, the speed difference (kts), distance between end points (nautical miles), shear (s-1), and distance from radar (Nmi) are automatically plotted next to the end points and in the upper left corner of the Main Display Pane. A positive shear value indicates cyclonic shear, while a negative value indicates anticyclonic shear. If either end point is not directly over velocity data, the phrase \"no data\" is reported for the shear value. This tool is also useful in determining gate-to-gate shear. Simply place the two end points directly over adjacent gates of velocity data. \"Snapping\" VR Shear : If you are zoomed in over an area when you load VR - Shear, and the VR - Shear Baseline does not appear, click B3 to \"snap\" the Baseline to where the mouse cursor is located. VR - Shear in 4 Panel : You can use the VR - Shear Tool when the large display is in 4 panel mode. The VR - Shear overlay is loaded in different colors for each panel. There are actually four copies of the program running, and each behaves independently. This means that you can get accurate readings in any one of the four panels \u2014 one VR - Shear panel is editable at a time. To activate, click B2 on the VR - Shear legend in the desired panel and position the query line to the echoes of interest.","title":"NEXRAD Radar Display"},{"location":"cave/d2d-radar/#nexrad-radar-display","text":"The Unidata D2D Perspective features a selectable NEXRAD station display over a loop of the FNEXRAD Digital Hybrid Reflectivity product. Selecting any station will open a two-panel reflectivity and velocity view for the selected station.","title":"NEXRAD Radar Display"},{"location":"cave/d2d-radar/#nexrad-tdwr-station-menus","text":"Individual NEXRAD station menus are accessible in Radar > NEXRAD Stations and are grouped alphabetically for a condensed submenu structure. With only the NEXRAD3 feedtype (NEXRAD2 being disabled), notice that only some of the menu items will fill out with available data.","title":"NEXRAD & TDWR Station Menus"},{"location":"cave/d2d-radar/#best-res-zsrm8-zv","text":"The radar combination products Z+SRM and Z+V are precombined formats of the reflectivity and storm relative motion or velocity, displayed together via a single menu selection.
SRM products include the storm motion vector information, which is plotted in the upper left corner of the Main Display Pane.","title":"Best Res Z+SRM8 / Z+V"},{"location":"cave/d2d-radar/#4-panel-zsrm-zdrv-kdphc-ccsw","text":"","title":"4-panel Z+SRM, ZDR+V, KDP+HC, CC+SW"},{"location":"cave/d2d-radar/#4-panel-z-zdr-hckdp-cc","text":"This section enables you to load multiple base and dual-pol products, which are then simultaneously displayed. The label of this section of the menu describes the format for loading the products: Z+SRM in the upper left quadrant, ZDR+V in the upper right quadrant, KDP+HC in the lower left quadrant, and CC+SW in the lower right quadrant. Primary dual-pol base data analysis is best accomplished using the All Tilts base data option (4 panel all tilts with 8 products loaded), though you may use the single tilts (e.g., 0.5 base data) for longer time duration loops. To load 4 panel displays containing multiple elevation angles of the same product, you would select the four panel option and then select the desired set of 4 panels from the four panel submenu. All Tilts allows you to step or animate in either space or time. Selecting one of the All Tilts buttons will load all the tilts available from the latest volume scan. It will continue to load tilts from previous volume scans until it has loaded as many frames as indicated on the frame count menu. Auto updates will add higher tilts from the latest volume scan, replacing a tilt from the oldest volume. After loading an All Tilts display, Shift + LEFT ARROW and Shift + RIGHT ARROW and looping will take you through the frames in the order in which the system loaded them (without regard to volume scan or tilt). The UP ARROW and DOWN ARROW will step the display up or down in a volume scan allowing the tilts to change for a fixed time. The RIGHT ARROW and LEFT ARROW will step the display forward or backward through time at a fixed tilt. Once you have set the mode of motion (vertical or time), the Page Up/Page Down keys will start and adjust loop speed. To switch from vertical to time mode or from time to vertical mode, press the desired arrow key. If you hit the up or down arrow key in a standard (not All-Tilts) display, looping and stepping are disabled until you hit either the left or right arrow key or one of the stepping buttons on the menu. Once an arrow key (Left, Right, Up, Down) has been pressed, the stepping/animation controls on the main window toolbar and the Page Up/Page Down keys will function in that same mode. For example, assume the UP ARROW or DOWN ARROW key is pressed; the menu controls will now operate through the tilts at a fixed time, e.g., you can go to the lowest tilt by selecting the First Frame iconified button.","title":"4-panel Z, ZDR, HC+KDP, CC"},{"location":"cave/d2d-radar/#best-res-base-products","text":"This section is divided into two parts. The upper part lists individual products: four base products and three dual-pol products (ZDR, CC, and KDP). The lower part includes submenus for accessing multiple products and applications. The following describes the submenus grouped in the lower part of the Best Res Base Products section. Precip : In addition to the QPE dual-pol products, this submenu includes the legacy precip products, which include Storm Total, One Hour, Three Hour, and User Selectable precipitation products. A suite of snowfall products is also available on the Precip submenu. 
All are available for request (OTR, RMR), and the first four can be added to an RPS (Routine Product Set) list. All of these products are available on any scale. Derived Products : The Derived Products submenu includes Layer Reflectivity, Cross Section, and Other products displayed on any scale. Derived products include precipitation, storm (mesocyclone, hail, tornado), and wind derivations. Algorithm Overlays : The Algorithm Overlays submenu includes legacy algorithm overlays and the ML dual-pol overlay. four panel : The four panel submenu includes menu entries for Z+V, Z+SRM 8- and 4-bit, and some other combinations that are presented in 4-panel mode, with a different elevation angle or product in each panel. Data Quality : The Data Quality products, accessible by a pull-right submenu, include Clutter Filter Control and reflectivity and velocity clutter probability products. 4-bit/Legacy Prods : The 4-bit/Legacy Prods submenu uses generic selectors that load 8-bit (256 level) data, with legacy 4-bit (16 level) and 3-bit (8 level) data filling in when no 8-bit data is available. Radar Applications : The Radar Applications submenu provides access to all the radar applications and radar tools.","title":"Best Res Base Products"},{"location":"cave/d2d-radar/#mrms","text":"","title":"MRMS"},{"location":"cave/d2d-radar/#fnexrad-composites","text":"","title":"FNEXRAD Composites"},{"location":"cave/d2d-radar/#dhr","text":"","title":"DHR"},{"location":"cave/d2d-radar/#dlv","text":"","title":"DLV"},{"location":"cave/d2d-radar/#eet","text":"","title":"EET"},{"location":"cave/d2d-radar/#hhc","text":"","title":"HHC"},{"location":"cave/d2d-radar/#daa","text":"","title":"DAA"},{"location":"cave/d2d-radar/#dta","text":"","title":"DTA"},{"location":"cave/d2d-radar/#mosaic-radar-plots","text":"Mosaics available via this menu use data from up to nine nearby radars. Additional optional mosaics on cascading menus provide a limited list of radar products from a predefined set of WSR-88D radars within a given region. Your System Manager or site Administrator can set up such mosaics by editing /awips2/edex/data/utility/common_static/site//radar/radarInUse.txt . A mosaicInfo.txt table will only work while logged on to an AWIPS workstation.","title":"Mosaic Radar Plots"},{"location":"cave/d2d-radar/#n0q","text":"","title":"N0Q"},{"location":"cave/d2d-radar/#dsp","text":"","title":"DSP"},{"location":"cave/d2d-radar/#dta_1","text":"","title":"DTA"},{"location":"cave/d2d-radar/#daa_1","text":"","title":"DAA"},{"location":"cave/d2d-radar/#radar-applications","text":"","title":"Radar Applications"},{"location":"cave/d2d-radar/#estimated-actual-velocity-eav","text":"A velocity (V) display from the radar shows only the radial component of the wind, so the indicated speed depends on the direction of the wind and the azimuth (direction) from the radar. Consider, for example, a north wind. Straight north of the radar, the full speed of the wind will be seen on the V product. As one moves around to the east of the radar, the radial component gets smaller, eventually reaching zero straight east of the radar. If the wind direction is known, then the actual wind speed can be computed by dividing the observed radial speed by the cosine of the angle between the radar radial and the actual direction.
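The relationship stated above can be written directly: the estimated actual speed equals the observed radial speed divided by the cosine of the angle between the radar radial and the true wind direction. A minimal stand-alone sketch (illustrative only, not AWIPS code; the sample numbers are hypothetical):

```python
import math

def estimated_actual_speed(radial_speed_kts, angle_deg):
    """Actual wind speed recovered from the Doppler radial component.

    radial_speed_kts -- radial speed sampled from the V product
    angle_deg        -- angle between the radar radial and the true wind direction
    """
    cosine = math.cos(math.radians(angle_deg))
    if abs(cosine) < 1e-6:
        raise ValueError("wind is nearly perpendicular to the radial; the estimate is undefined")
    return radial_speed_kts / cosine

# A 30 kt radial speed observed 40 degrees off the true wind direction
# implies an actual speed of roughly 39 kt.
print(round(estimated_actual_speed(30.0, 40.0), 1))
```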
The EAV tool allows you to provide that angle and use the sampling function of the display to show the actual wind speed.","title":"Estimated Actual Velocity (EAV)"},{"location":"cave/d2d-radar/#four-dimensional-stormcell-investigator-fsi","text":"The Four-dimensional Stormcell Investigator (FSI) was developed by the National Severe Storms Laboratory for its Warning Decision Support System Integrated Information. This technology allows users to create and manipulate dynamic cross-sections (both vertical and at constant altitude), such that one can \u201cslice and dice\u201d storms and view these data in three-dimensions and across time.","title":"Four-dimensional Stormcell Investigator (FSI)"},{"location":"cave/d2d-radar/#v-r-shear","text":"This tool is used in conjunction with Doppler velocity data to calculate the velocity difference (or \"shear\") of the data directly under the end points. As with the Baselines, this feature comes up editable and the end points can be dragged to specific gates of velocity data. When in place, the speed difference (kts), distance between end points (nautical miles), shear (s-1), and distance from radar (Nmi) are automatically plotted next to the end points and in the upper left corner of the Main Display Pane. A positive shear value indicates cyclonic shear, while a negative value indicates anticyclonic shear. If either end point is not directly over velocity data, the phrase \"no data\" is reported for the shear value. This tool is also useful in determining gate-to-gate shear. Simply place the two end points directly over adjacent gates of velocity data. \"Snapping\" VR Shear : If you are zoomed in over an area when you load VR - Shear, and the VR - Shear Baseline does not appear, click B3 to \"snap\" the Baseline to where the mouse cursor is located. VR - Shear in 4 Panel : You can use the VR - Shear Tool when the large display is in 4 panel mode. The VR - Shear overlay is loaded in different colors for each panel. There are actually four copies of the program running, and each behaves independently. This means that you can get accurate readings in any one of the four panels \u2014 one VR - Shear panel is editable at a time. To activate, click B2 on the VR - Shear legend in the desired panel and position the query line to the echoes of interest.","title":"V-R Shear"},{"location":"cave/d2d-satellite/","text":"NOAAport GINI imagery \uf0c1 Uniwisc McIDAS AREA files \uf0c1 VIIRS \uf0c1 VIIRS is one of five instruments onboard the NPP satellite. VIIRS' mission is to collect radiometric imagery in visible and infrared wavelengths of the Earth's surface; this includes observing fires, ice, ocean color, vegetation, clouds, and land and sea surface temperatures, and supplying high-resolution images and data used by meteorologists to assess climate change and improve short-term weather forecasting. The VIIRS submenu option provides VIIRS imagery and moderate band satellite displays for the CONUS, Alaska, and Pacific regions. In addition to accessing the NPP Product VIIRS data via the Satellite menu, the VIIRS Imagery data can also be accessed using the Product Browser . GOES and POES Sounding Data \uf0c1 GOES and POES Sounding Data Availability Plots displays the locations where GOES and POES temperature and moisture profiles are available. These soundings are displayed on a Skew-T/log P chart using the Points tool and the Volume Browser. 
Soundings from the GOES satellites are made only in relatively cloud-free areas, whereas POES systems produce temperature and moisture soundings in clear and cloudy atmospheres. Each hour, NESDIS provides the latest soundings from GOES East and West. Although the GOES East and West sounders yield soundings over a broad area, the default AWIPS configuration retains soundings only from within each site's Regional CAVE scale domain. POES soundings are generated approximately every 12 hours and have more global coverage. POES Imagery \uf0c1 The POES Imagery section of the Satellite menu contains selectors for IR Window, Visible, 3.7\u00b5, and 11-3.7\u00b5 products. These are viewable on all scales. Sounder Imagery \uf0c1 The products available from the Sounder Imagery submenu are based purely on the imager instruments aboard the GOES East (GE) and GOES West (GW) satellites. Derived Products Imagery \uf0c1 A variety of precipitation products are accessible from the Derived Products submenu. These products are derived from one or more of the various satellites (e.g., DMPS, POES, GOES, and GPS). Descriptions of the products follow. The Blended Rain Rate (formerly Rainfall Rate) product is produced hourly to gather recent rain rate retrievals from passive microwave instruments on six polar-orbiting satellites. The blended rain rate eliminates the bias between those data sets and provides a unified, meteorologically significant rain rate field to weather forecasters. The GOES products derived from the GOES satellite include Lifted Index, Total Precip Water (TPW), Cloud Amount, Cloud Top Height, Skin Temperature, and Low Cloud Base. Because the imagery from these products is based on the GOES sounder instrument, several important differences exist between these products and the other (imager-based) imagery. The main differences are that the resolution is no finer than 10 km, the product update frequency is driven by the sounder instrument (AWIPS receives a set of GOES East/West composite derived product images once per hour), and the aerial coverage is based on that of the sounder scans, which is somewhat less than the aerial coverage provided by the imager. Descriptions of the products follow. Lifted Index is a common measure of instability. Its value is obtained by computing the temperature that air near the ground would have if it were lifted to some higher level (usually around 18,000 feet), and comparing that temperature to the actual temperature at that level. The more negative the value, the more instability there is. Total Precip Water is the vertically integrated water vapor content in a column extending from the earth's surface to the top of the atmosphere. Cloud Amount provides an hourly update of cloud amounts within a geostationary satellite field of view. You can loop through the display to identify increasing/decreasing cloud conditions and trends. Cloud Top Height is the height of the cloud in thousands of feet (base - top). Skin Temperature is the sea surface temperature of the ocean surface water. Low Cloud Base provides nighttime images of fog and low stratus clouds derived from a combination of two GOES IR channels. This product identifies cloud ceilings of <1000 feet and is generated hourly starting between 2042 and 2142 GMT, and ending between 1510 and 1610 GMT the next day. 
This product is beneficial to the warning and forecast processes specific to aviation and terminal forecasting The Total Precip Water (TPW) value can also be derived from the data sources of DMSP, SSM/I (Defense Meteorological Satellite Program Special Sensor Microwave / Imager), and POES AMSU (POES Advanced Microwave Sounding Unit) satellites, which are accessed from the DMSP SSM/I, and POES AMSU sections of the submenu. Variations of TPW (\"Blended Total Precip Water\" and \"Percent of Normal TPW\") are selectable under the AMSU and SSM/I + GPS section. The Blended Total Precip Water product is a blend of the various data sources of AMSU, SSM/I, and GPS satellites, and can be over water or land. The Percent of Normal TPW product is calculated at various times (hourly, monthly, seasonally, etc.) to determine departures from the normal. From the information obtained, forecasters can predict the chances of having a below average, normal, or above average precipitation in the upcoming months. SSM/I Point Data \uf0c1 SSM/I Point Data plot displays data collected over the course of a day for calculating ocean wind speeds. GOES High Density Winds \uf0c1 GOES High Density Winds submenu has options to display satellite-derived multi-layer winds plots from the IR, Visible, and three Water Vapor channels. In addition, you can display individual layers that display a composite of all the satellite channels. MTSAT High Density Winds \uf0c1 MTSAT High Density Winds cover the Western Pacific. ASCAT winds (25 km) \uf0c1 Scatterometer Winds are obtained from the ASCAT instrument on EUMETSAT's MetOP-A polar orbiting satellite. This instrument sends pulses of radiation to the ocean surface and measures the amount of energy, called backscatter, it receives back. When you sample these observations, the time, satellite ID, wind direction, and wind speed are provided. With the polar orbiting scanning, a given region will generally be sampled about every 12 hours. ASCAT winds (25 km retrieval resolution but interpolated and displayed at 12.5 km resolution) are launchable from both the CAVE Satellite menu and the Upper Air menu. The ASCAT instrument generates ocean surface wind retrievals. The ASCAT Scatterometer Ocean Winds product is displayable on CAVE at all scales.","title":"D2d satellite"},{"location":"cave/d2d-satellite/#noaaport-gini-imagery","text":"","title":"NOAAport GINI imagery"},{"location":"cave/d2d-satellite/#uniwisc-mcidas-area-files","text":"","title":"Uniwisc McIDAS AREA files"},{"location":"cave/d2d-satellite/#viirs","text":"VIIRS is one of five instruments onboard the NPP satellite. VIIRS' mission is to collect radiometric imagery in visible and infrared wavelengths of the Earth's surface; this includes observing fires, ice, ocean color, vegetation, clouds, and land and sea surface temperatures, and supplying high-resolution images and data used by meteorologists to assess climate change and improve short-term weather forecasting. The VIIRS submenu option provides VIIRS imagery and moderate band satellite displays for the CONUS, Alaska, and Pacific regions. In addition to accessing the NPP Product VIIRS data via the Satellite menu, the VIIRS Imagery data can also be accessed using the Product Browser .","title":"VIIRS"},{"location":"cave/d2d-satellite/#goes-and-poes-sounding-data","text":"GOES and POES Sounding Data Availability Plots displays the locations where GOES and POES temperature and moisture profiles are available. 
These soundings are displayed on a Skew-T/log P chart using the Points tool and the Volume Browser. Soundings from the GOES satellites are made only in relatively cloud-free areas, whereas POES systems produce temperature and moisture soundings in clear and cloudy atmospheres. Each hour, NESDIS provides the latest soundings from GOES East and West. Although the GOES East and West sounders yield soundings over a broad area, the default AWIPS configuration retains soundings only from within each site's Regional CAVE scale domain. POES soundings are generated approximately every 12 hours and have more global coverage.","title":"GOES and POES Sounding Data"},{"location":"cave/d2d-satellite/#poes-imagery","text":"The POES Imagery section of the Satellite menu contains selectors for IR Window, Visible, 3.7\u00b5, and 11-3.7\u00b5 products. These are viewable on all scales.","title":"POES Imagery"},{"location":"cave/d2d-satellite/#sounder-imagery","text":"The products available from the Sounder Imagery submenu are based purely on the imager instruments aboard the GOES East (GE) and GOES West (GW) satellites.","title":"Sounder Imagery"},{"location":"cave/d2d-satellite/#derived-products-imagery","text":"A variety of precipitation products are accessible from the Derived Products submenu. These products are derived from one or more of the various satellites (e.g., DMPS, POES, GOES, and GPS). Descriptions of the products follow. The Blended Rain Rate (formerly Rainfall Rate) product is produced hourly to gather recent rain rate retrievals from passive microwave instruments on six polar-orbiting satellites. The blended rain rate eliminates the bias between those data sets and provides a unified, meteorologically significant rain rate field to weather forecasters. The GOES products derived from the GOES satellite include Lifted Index, Total Precip Water (TPW), Cloud Amount, Cloud Top Height, Skin Temperature, and Low Cloud Base. Because the imagery from these products is based on the GOES sounder instrument, several important differences exist between these products and the other (imager-based) imagery. The main differences are that the resolution is no finer than 10 km, the product update frequency is driven by the sounder instrument (AWIPS receives a set of GOES East/West composite derived product images once per hour), and the aerial coverage is based on that of the sounder scans, which is somewhat less than the aerial coverage provided by the imager. Descriptions of the products follow. Lifted Index is a common measure of instability. Its value is obtained by computing the temperature that air near the ground would have if it were lifted to some higher level (usually around 18,000 feet), and comparing that temperature to the actual temperature at that level. The more negative the value, the more instability there is. Total Precip Water is the vertically integrated water vapor content in a column extending from the earth's surface to the top of the atmosphere. Cloud Amount provides an hourly update of cloud amounts within a geostationary satellite field of view. You can loop through the display to identify increasing/decreasing cloud conditions and trends. Cloud Top Height is the height of the cloud in thousands of feet (base - top). Skin Temperature is the sea surface temperature of the ocean surface water. Low Cloud Base provides nighttime images of fog and low stratus clouds derived from a combination of two GOES IR channels. 
This product identifies cloud ceilings of <1000 feet and is generated hourly starting between 2042 and 2142 GMT, and ending between 1510 and 1610 GMT the next day. This product is beneficial to the warning and forecast processes specific to aviation and terminal forecasting The Total Precip Water (TPW) value can also be derived from the data sources of DMSP, SSM/I (Defense Meteorological Satellite Program Special Sensor Microwave / Imager), and POES AMSU (POES Advanced Microwave Sounding Unit) satellites, which are accessed from the DMSP SSM/I, and POES AMSU sections of the submenu. Variations of TPW (\"Blended Total Precip Water\" and \"Percent of Normal TPW\") are selectable under the AMSU and SSM/I + GPS section. The Blended Total Precip Water product is a blend of the various data sources of AMSU, SSM/I, and GPS satellites, and can be over water or land. The Percent of Normal TPW product is calculated at various times (hourly, monthly, seasonally, etc.) to determine departures from the normal. From the information obtained, forecasters can predict the chances of having a below average, normal, or above average precipitation in the upcoming months.","title":"Derived Products Imagery"},{"location":"cave/d2d-satellite/#ssmi-point-data","text":"SSM/I Point Data plot displays data collected over the course of a day for calculating ocean wind speeds.","title":"SSM/I Point Data"},{"location":"cave/d2d-satellite/#goes-high-density-winds","text":"GOES High Density Winds submenu has options to display satellite-derived multi-layer winds plots from the IR, Visible, and three Water Vapor channels. In addition, you can display individual layers that display a composite of all the satellite channels.","title":"GOES High Density Winds"},{"location":"cave/d2d-satellite/#mtsat-high-density-winds","text":"MTSAT High Density Winds cover the Western Pacific.","title":"MTSAT High Density Winds"},{"location":"cave/d2d-satellite/#ascat-winds-25-km","text":"Scatterometer Winds are obtained from the ASCAT instrument on EUMETSAT's MetOP-A polar orbiting satellite. This instrument sends pulses of radiation to the ocean surface and measures the amount of energy, called backscatter, it receives back. When you sample these observations, the time, satellite ID, wind direction, and wind speed are provided. With the polar orbiting scanning, a given region will generally be sampled about every 12 hours. ASCAT winds (25 km retrieval resolution but interpolated and displayed at 12.5 km resolution) are launchable from both the CAVE Satellite menu and the Upper Air menu. The ASCAT instrument generates ocean surface wind retrievals. The ASCAT Scatterometer Ocean Winds product is displayable on CAVE at all scales.","title":"ASCAT winds (25 km)"},{"location":"cave/d2d-tools/","text":"Display Tools \uf0c1 The display tools are a subset of the tools available in CAVE. These programs are accessible though the Tools dropdown menu. Many of the tools listed under the Tools menu can be placed into an editable state . Do not enable the \"Hide Legends\" feature if you want to place a tool in an editable state, because access to editability is done by clicking the center mouse button, or right-clicking over the Product Legend . Note : To see information about some of the other options in the Tools menu, check out the Radar Tools page. Az/Ran Overlay \uf0c1 This tool displays a movable azimuth/range radar map overlay. The overlay is in the \"editable\" state when displayed, and can be relocated by clicking the right mouse button. 
Baselines \uf0c1 Selecting Baselines displays 10 lines, labeled A-A' to J-J', along which cross-sections can be constructed from within the Volume Browser. Baselines come up editable. \"Snapping\" an Interactive Baseline: If you are zoomed in over an area when you load Interactive Baselines and no Baselines appear, press the right mouse button to \"snap\" a Baseline to where the mouse cursor is. The system chooses a Baseline that has not been recently used. If you are working with a Baseline, a second click with the right mouse button will return you to the original Baseline, even if you modified another Baseline in the meantime. Choose By ID \uf0c1 Choose By ID, which is a function of DMD (Digital Mesocyclone Display), is a method of selecting feature locations. The tool is used to monitor the same feature at a certain location. Without the Choose By ID tool, a monitored feature (over a period of time) could move away from its monitored location and another feature could move in its place. You can use Choose By ID to set points, baselines, and \"Home\" for conventional locations like METARs and RAOBs (Radiosonde Observations), but its primary use is for the WSR-88D-identified mesocyclone locations. You can also access the Choose By ID tool from the Tools menu on the Volume Browser. Distance Bearing \uf0c1 Selecting this tool displays six editable lines, each of which shows the azimuth and range of the labeled end of the line relative to the unlabeled end of the line. You can make the lines editable by clicking the center mouse button over the legend at the lower right of the display. Once in edit mode, a line can be moved as a unit and/or either of its end points can be adjusted. Distance Speed \uf0c1 This tool can be used to determine the speed and direction of a storm or any other meteorological feature of interest. Selecting Distance Speed displays a Centroid Marker to move to the location of the storm or feature of interest in any two or more frames of displayed imagery (e.g., a satellite or radar loop). The system then displays a storm track with the direction (degrees) and speed (knots) of movement. When you select the Distance Speed option, the Distance Speed dialog box opens. Mode : You have the following selections from this option. Point : A radio button that allows you to set the Centroid Marker as a single point. Polyline : A radio button that allows you to set the Centroid Marker as a polyline. Legend : You have the following selections from this option. Time : A radio button that allows you to display time with the Centroid Marker. Speed : A radio button that allows you to display speed with the Centroid Marker. Distance Scale \uf0c1 Enabling this feature adds a scalebar to the bottom right hand of the main D2D display. This tool can be used to determine the size of a storm or any other meteorological feature of interest. Feature Following Zoom \uf0c1 When you zoom in over a small area to be able to view a feature in detail, animation will often cause the feature to move into and then out of the field of view. This tool allows you to follow a feature of interest even when zoomed in to a small area. To use this feature, first, you need to identify the location and motion of the feature, using Distance Speed or the WarnGen tracker. Once satisfied that the tracking icon is following the feature of interest, load this tool, and the center of the zoom area will track with the Distance Speed icon. 
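The Distance Speed tool described above derives the track direction and speed from the change in the Centroid Marker's position between frames. The stand-alone sketch below shows the underlying arithmetic using an equirectangular approximation (illustrative only, not AWIPS code; the fixes and the 10-minute interval are hypothetical):

```python
import math

EARTH_RADIUS_NMI = 3440.065  # mean Earth radius in nautical miles

def track_motion(lat1, lon1, lat2, lon2, minutes):
    """Direction of travel (degrees, 0 = toward north) and speed (kts) between two fixes."""
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2.0))
    distance_nmi = EARTH_RADIUS_NMI * math.hypot(dlat, dlon)
    direction_deg = math.degrees(math.atan2(dlon, dlat)) % 360.0
    speed_kts = distance_nmi / (minutes / 60.0)
    return direction_deg, speed_kts

# A cell moving from (35.00N, 97.50W) to (35.05N, 97.40W) in 10 minutes
# is tracking toward roughly 059 degrees at about 35 kt.
direction, speed = track_motion(35.00, -97.50, 35.05, -97.40, 10.0)
print(round(direction), round(speed))
```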
Toggling the overlay off will resume the standard zooming behavior, and toggling it back on will reinvoke the feature following zoom. Home \uf0c1 Selecting the Home option displays a marker, which is an \"X\" with the word \"Home\" next to it. Clicking on the Home Location Legend with the center mouse button makes the marker editable; drag the \"X\" or click with the right mouse button to change its location. When the Home Marker is displayed, use the Sample feature (click and hold to access the menu to turn on sampling) to display the range in miles and azimuth (in degrees) of the pointer location relative to the Home location. Points \uf0c1 The Points option initially displays a circular 10-point pattern, labeled A through J on the Map display. Points are used to generate model soundings, time-height cross-sections, time series, and variable vs. height plots using the Volume Browser. As with the Baselines, the locations of these Points can be edited in the following manner: \"Snapping\" an Interactive Point : If you are zoomed in over an area when you load Interactive Points and no Points appear, click the right mouse button to \"snap\" a Point to where the mouse cursor is positioned. The system chooses a Point that has not been recently used. If you are currently working with a Point, then a second right mouse button click will place another Point at the location of your cursor. Dynamic Reference Map : When you generate a model sounding, a time-height cross-section, a time series, or a variable vs. height plot, a small reference map indicating the location(s) of the plotted sounding(s) is provided in the upper left corner of the Main Display Pane. Points may be created, deleted, hidden, and manipulated (location, name, font, and color). Points are not limited in terms of number, location, or designation. Points may also be assigned to different groups to facilitate their use. Once the Points tools have been loaded, the addition, deletion, or manipulation of Points can be accomplished in three ways: Create Point Dialog Box : The Create Point dialog box is opened by clicking and holding the right mouse button on the map (but not on any exisiting Point) and selecting the \"New Point...\" option. The Create Point dialog box opens with the Lat and Lon text boxes populated with the latitude and longiture values at the point where you had clicked the right mouse button. The latitude and longitude values can be viewed in \"Degrees : Minutes : Seconds,\" \"Degrees : Minutes,\" or \"Degrees Only\" (N and S refer to North and South; W and E refer to West and East). In the Create Point dialog box, you must : Enter the Point's name And may do any of the following: Modify the latitude and longitude values Assign the Point's color and font use Assign the Point to a group Select whether the Point is movable or hidden By default, individual Points do not have an assigned color. They inherit the color of the Interactive Points layer reflected in the Interactive Points product legend. You can change the color of the Interactive Points layer by right clicking on the Interactive Points product legend and selecting a color from the dropdown list. The selected color then changes all points not having an assigned color to the new color. Points can be assigned to \" \" which will organize them in the root location containing the group names when accessed by the Edit Points dialog box (see below). 
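For reference, the three coordinate formats mentioned above are just different renderings of the same value; a small stand-alone conversion from decimal degrees to the Degrees : Minutes : Seconds form looks like this (illustrative only, not AWIPS code):

```python
def to_dms(decimal_degrees, positive="N", negative="S"):
    """Render a decimal-degree coordinate as Degrees : Minutes : Seconds."""
    hemisphere = positive if decimal_degrees >= 0 else negative
    value = abs(decimal_degrees)
    degrees = int(value)
    minutes = int((value - degrees) * 60)
    seconds = (value - degrees - minutes / 60.0) * 3600.0
    return f"{degrees}:{minutes:02d}:{seconds:04.1f} {hemisphere}"

# 40.1234 degrees of latitude renders as 40:07:24.2 N
print(to_dms(40.1234))
# -105.2705 degrees of longitude renders as 105:16:13.8 W
print(to_dms(-105.2705, positive="E", negative="W"))
```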
Edit Point Dialog Box : The Edit Point dialog box is opened by clicking and holding the right mouse button on a Point on the map and selecting the \"Edit Point...\" option. The latitude and longitude values can be viewed in \"Degrees : Minutes : Seconds,\" \"Degrees : Minutes,\" or \"Degrees Only\" (N and S refer to North and South; W and E refer to West and East). Besides the option of selecting the Edit Points dialog box, you also have the option of selecting \"Hide Point,\" \"Delete Point,\" or \"Move Point.\" Once hidden, the Point can be unhidden using the Points List dialog box, where you would uncheck the checkbox under the \"Hidden\" column adjacent to the Point that was hidden (see below). If \"Delete Point\" is selected, a pop-up opens to confirm whether you want to delete the Point. Selecting the \"Move Point\" option moves the Point to wherever you place the cursor on the map. Points List Dialog Box : The Points List dialog box is opened by clicking and holding the right mouse button on the Interactive Points product legend and selecting the \"Edit Points...\" option. The Points List dialog box lists all the available groups and Points. Groups can be expanded to review the list of Points assigned to that group by clicking the arrow next to the group name. Initially, the default set of Points (A-J) are listed in the D2D Group, as shown above. In the Points List dialog box, Points and groups may be dragged into and out of other groups to create or disassemble subgroups. The Points List dialog box also includes three columns. Point Name : Lists the group name and designated Points. Movable : Checking the checkbox adjacent to the Point disables the Point from being moved. Hidden : Checking the checkbox adjacent to the Point hides the Point on the map. Put home cursor \uf0c1 The Put home cursor tool provides an easy way to locate a METAR observation station, a city and state, or a latitude/longitude coordinate. For Canada and Mexico, only the METAR observation stations and latitude/longitude coordinates are accessible. When you select Put home cursor from the Tools dropdown menu, the Home marker X is displayed and the Put Home Cursor dialog box opens. You can use the Home marker, as previously described in the Home Tool, and the new Home location (station, city/state, or latitude/longitude) is identified in the Put Home Cursor dialog box. Another way to use this tool is to type in the station, city and state, or latitude and longitude, and select Go, or hit Enter on the keypad, to move the Home marker to the specified location. The new location's nearest METAR site, city and state, and latitude and longitude appear in the Put Home Cursor dialog box. The Put Home Cursor dialog box contains the following options. Location Selection : There are three ways to find a desired location. Once you choose the Station, City/State, or Lat/Lon radio button, an Entry Box is activated next to the respective label within the Put Home Cursor dialog box. Enter the desired location information. Go : This menu button initiates the search for the desired station, city/state, or latitude/longitude. The Home marker jumps to the newly specified location. Range Rings \uf0c1 The Range Rings Tool displays adjustable range rings around locations of interest to your local office. When you select Range Rings from the Tools dropdown menu, the Range Rings legend appears in the Main Display Pane. The tool comes up editable, and the rangeRing dialog box opens. 
(Clicking the middle mouse button over the legend toggles tool editability and closes/opens the rangeRing dialog box.) Within this dialog box, you can toggle on/off any of the target locations using the square selectors. Adjust the size of the radii (in nautical miles) by typing a new value in the entry boxes associated with each location and pressing the Apply button. You can also add labels at the center of the range ring and/or at any of the radial distances using the Labels Options menu associated with each location. Using the Movable Rings, you can add a new location at a specific point by using the Interactive Points Tool, or by typing in latitude/longitude coordinates. There is no practical limit on the number of new locations you can add to the display. The list of locations is pre-set but can be customized at a field site. Sunset/Sunrise \uf0c1 By typing a date, as well as the latitude and longitude of a location into the Sunrise/Sunset Tool dialog box, you can obtain the time (for any time zone) of sunrise and sunset, as well as the total length of daylight for that date. Additional features include the ability to calculate the sunrise/sunset in a different hemisphere, and the azimuthal angles, relative to true north, of the sunrise and sunset. Text Window \uf0c1 Selecting this option brings up a Text Display window that behaves in the same way as a window on the Text Workstation , except that the scripts menu is disabled. Time of Arrival / Lead Time \uf0c1 Selecting the Time Of Arrival / Lead Time option displays a tracking line from a feature's initial starting point in a past frame to its final position in the current frame. Once the final position is set, an Arrival Point is displayed. You can drag this point anywhere along the line to get the Time Of Arrival / Lead Time and Distance. You can also change the Mode from Point to Circular Front or Polyline anywhere along the line to better represent the feature(s). Units Calculator \uf0c1 This tool converts the units of the first column into differing units of the second column. The units are grouped into temperature, speed, distance, time, and atmospheric pressure. First, simply type the number and select the units of the value you wish to convert in the firstcolumn entry box. Then in the second column, select the desired units to which you want the original value converted. The new value will appear in the second column entry box. Text Workstation \uf0c1 By selecting one of the \"Text\" buttons, a text window opens up. In National Weather Service operations, the text workstation is used to edit new warning text as well as look up past warnings, METARs, and TAFs. This functionality is disabled in the Unidata AWIPS version.","title":"Display Tools"},{"location":"cave/d2d-tools/#display-tools","text":"The display tools are a subset of the tools available in CAVE. These programs are accessible though the Tools dropdown menu. Many of the tools listed under the Tools menu can be placed into an editable state . Do not enable the \"Hide Legends\" feature if you want to place a tool in an editable state, because access to editability is done by clicking the center mouse button, or right-clicking over the Product Legend . Note : To see information about some of the other options in the Tools menu, check out the Radar Tools page.","title":"Display Tools"},{"location":"cave/d2d-tools/#azran-overlay","text":"This tool displays a movable azimuth/range radar map overlay. 
The overlay is in the \"editable\" state when displayed, and can be relocated by clicking the right mouse button.","title":"Az/Ran Overlay"},{"location":"cave/d2d-tools/#baselines","text":"Selecting Baselines displays 10 lines, labeled A-A' to J-J', along which cross-sections can be constructed from within the Volume Browser. Baselines come up editable. \"Snapping\" an Interactive Baseline: If you are zoomed in over an area when you load Interactive Baselines and no Baselines appear, press the right mouse button to \"snap\" a Baseline to where the mouse cursor is. The system chooses a Baseline that has not been recently used. If you are working with a Baseline, a second click with the right mouse button will return you to the original Baseline, even if you modified another Baseline in the meantime.","title":"Baselines"},{"location":"cave/d2d-tools/#choose-by-id","text":"Choose By ID, which is a function of DMD (Digital Mesocyclone Display), is a method of selecting feature locations. The tool is used to monitor the same feature at a certain location. Without the Choose By ID tool, a monitored feature (over a period of time) could move away from its monitored location and another feature could move in its place. You can use Choose By ID to set points, baselines, and \"Home\" for conventional locations like METARs and RAOBs (Radiosonde Observations), but its primary use is for the WSR-88D-identified mesocyclone locations. You can also access the Choose By ID tool from the Tools menu on the Volume Browser.","title":"Choose By ID"},{"location":"cave/d2d-tools/#distance-bearing","text":"Selecting this tool displays six editable lines, each of which shows the azimuth and range of the labeled end of the line relative to the unlabeled end of the line. You can make the lines editable by clicking the center mouse button over the legend at the lower right of the display. Once in edit mode, a line can be moved as a unit and/or either of its end points can be adjusted.","title":"Distance Bearing"},{"location":"cave/d2d-tools/#distance-speed","text":"This tool can be used to determine the speed and direction of a storm or any other meteorological feature of interest. Selecting Distance Speed displays a Centroid Marker to move to the location of the storm or feature of interest in any two or more frames of displayed imagery (e.g., a satellite or radar loop). The system then displays a storm track with the direction (degrees) and speed (knots) of movement. When you select the Distance Speed option, the Distance Speed dialog box opens. Mode : You have the following selections from this option. Point : A radio button that allows you to set the Centroid Marker as a single point. Polyline : A radio button that allows you to set the Centroid Marker as a polyline. Legend : You have the following selections from this option. Time : A radio button that allows you to display time with the Centroid Marker. Speed : A radio button that allows you to display speed with the Centroid Marker.","title":"Distance Speed"},{"location":"cave/d2d-tools/#distance-scale","text":"Enabling this feature adds a scalebar to the bottom right hand of the main D2D display. This tool can be used to determine the size of a storm or any other meteorological feature of interest.","title":"Distance Scale"},{"location":"cave/d2d-tools/#feature-following-zoom","text":"When you zoom in over a small area to be able to view a feature in detail, animation will often cause the feature to move into and then out of the field of view. 
This tool allows you to follow a feature of interest even when zoomed in to a small area. To use this feature, first, you need to identify the location and motion of the feature, using Distance Speed or the WarnGen tracker. Once satisfied that the tracking icon is following the feature of interest, load this tool, and the center of the zoom area will track with the Distance Speed icon. Toggling the overlay off will resume the standard zooming behavior, and toggling it back on will reinvoke the feature following zoom.","title":"Feature Following Zoom"},{"location":"cave/d2d-tools/#home","text":"Selecting the Home option displays a marker, which is an \"X\" with the word \"Home\" next to it. Clicking on the Home Location Legend with the center mouse button makes the marker editable; drag the \"X\" or click with the right mouse button to change its location. When the Home Marker is displayed, use the Sample feature (click and hold to access the menu to turn on sampling) to display the range in miles and azimuth (in degrees) of the pointer location relative to the Home location.","title":"Home"},{"location":"cave/d2d-tools/#points","text":"The Points option initially displays a circular 10-point pattern, labeled A through J on the Map display. Points are used to generate model soundings, time-height cross-sections, time series, and variable vs. height plots using the Volume Browser. As with the Baselines, the locations of these Points can be edited in the following manner: \"Snapping\" an Interactive Point : If you are zoomed in over an area when you load Interactive Points and no Points appear, click the right mouse button to \"snap\" a Point to where the mouse cursor is positioned. The system chooses a Point that has not been recently used. If you are currently working with a Point, then a second right mouse button click will place another Point at the location of your cursor. Dynamic Reference Map : When you generate a model sounding, a time-height cross-section, a time series, or a variable vs. height plot, a small reference map indicating the location(s) of the plotted sounding(s) is provided in the upper left corner of the Main Display Pane. Points may be created, deleted, hidden, and manipulated (location, name, font, and color). Points are not limited in terms of number, location, or designation. Points may also be assigned to different groups to facilitate their use. Once the Points tools have been loaded, the addition, deletion, or manipulation of Points can be accomplished in three ways: Create Point Dialog Box : The Create Point dialog box is opened by clicking and holding the right mouse button on the map (but not on any exisiting Point) and selecting the \"New Point...\" option. The Create Point dialog box opens with the Lat and Lon text boxes populated with the latitude and longiture values at the point where you had clicked the right mouse button. The latitude and longitude values can be viewed in \"Degrees : Minutes : Seconds,\" \"Degrees : Minutes,\" or \"Degrees Only\" (N and S refer to North and South; W and E refer to West and East). In the Create Point dialog box, you must : Enter the Point's name And may do any of the following: Modify the latitude and longitude values Assign the Point's color and font use Assign the Point to a group Select whether the Point is movable or hidden By default, individual Points do not have an assigned color. They inherit the color of the Interactive Points layer reflected in the Interactive Points product legend. 
You can change the color of the Interactive Points layer by right clicking on the Interactive Points product legend and selecting a color from the dropdown list. The selected color then changes all points not having an assigned color to the new color. Points can be assigned to \" \" which will organize them in the root location containing the group names when accessed by the Edit Points dialog box (see below). Edit Point Dialog Box : The Edit Point dialog box is opened by clicking and holding the right mouse button on a Point on the map and selecting the \"Edit Point...\" option. The latitude and longitude values can be viewed in \"Degrees : Minutes : Seconds,\" \"Degrees : Minutes,\" or \"Degrees Only\" (N and S refer to North and South; W and E refer to West and East). Besides the option of selecting the Edit Points dialog box, you also have the option of selecting \"Hide Point,\" \"Delete Point,\" or \"Move Point.\" Once hidden, the Point can be unhidden using the Points List dialog box, where you would uncheck the checkbox under the \"Hidden\" column adjacent to the Point that was hidden (see below). If \"Delete Point\" is selected, a pop-up opens to confirm whether you want to delete the Point. Selecting the \"Move Point\" option moves the Point to wherever you place the cursor on the map. Points List Dialog Box : The Points List dialog box is opened by clicking and holding the right mouse button on the Interactive Points product legend and selecting the \"Edit Points...\" option. The Points List dialog box lists all the available groups and Points. Groups can be expanded to review the list of Points assigned to that group by clicking the arrow next to the group name. Initially, the default set of Points (A-J) are listed in the D2D Group, as shown above. In the Points List dialog box, Points and groups may be dragged into and out of other groups to create or disassemble subgroups. The Points List dialog box also includes three columns. Point Name : Lists the group name and designated Points. Movable : Checking the checkbox adjacent to the Point disables the Point from being moved. Hidden : Checking the checkbox adjacent to the Point hides the Point on the map.","title":"Points"},{"location":"cave/d2d-tools/#put-home-cursor","text":"The Put home cursor tool provides an easy way to locate a METAR observation station, a city and state, or a latitude/longitude coordinate. For Canada and Mexico, only the METAR observation stations and latitude/longitude coordinates are accessible. When you select Put home cursor from the Tools dropdown menu, the Home marker X is displayed and the Put Home Cursor dialog box opens. You can use the Home marker, as previously described in the Home Tool, and the new Home location (station, city/state, or latitude/longitude) is identified in the Put Home Cursor dialog box. Another way to use this tool is to type in the station, city and state, or latitude and longitude, and select Go, or hit Enter on the keypad, to move the Home marker to the specified location. The new location's nearest METAR site, city and state, and latitude and longitude appear in the Put Home Cursor dialog box. The Put Home Cursor dialog box contains the following options. Location Selection : There are three ways to find a desired location. Once you choose the Station, City/State, or Lat/Lon radio button, an Entry Box is activated next to the respective label within the Put Home Cursor dialog box. Enter the desired location information. 
Go : This menu button initiates the search for the desired station, city/state, or latitude/longitude. The Home marker jumps to the newly specified location.","title":"Put home cursor"},{"location":"cave/d2d-tools/#range-rings","text":"The Range Rings Tool displays adjustable range rings around locations of interest to your local office. When you select Range Rings from the Tools dropdown menu, the Range Rings legend appears in the Main Display Pane. The tool comes up editable, and the rangeRing dialog box opens. (Clicking the middle mouse button over the legend toggles tool editability and closes/opens the rangeRing dialog box.) Within this dialog box, you can toggle on/off any of the target locations using the square selectors. Adjust the size of the radii (in nautical miles) by typing a new value in the entry boxes associated with each location and pressing the Apply button. You can also add labels at the center of the range ring and/or at any of the radial distances using the Labels Options menu associated with each location. Using the Movable Rings, you can add a new location at a specific point by using the Interactive Points Tool, or by typing in latitude/longitude coordinates. There is no practical limit on the number of new locations you can add to the display. The list of locations is pre-set but can be customized at a field site.","title":"Range Rings"},{"location":"cave/d2d-tools/#sunsetsunrise","text":"By typing a date, as well as the latitude and longitude of a location into the Sunrise/Sunset Tool dialog box, you can obtain the time (for any time zone) of sunrise and sunset, as well as the total length of daylight for that date. Additional features include the ability to calculate the sunrise/sunset in a different hemisphere, and the azimuthal angles, relative to true north, of the sunrise and sunset.","title":"Sunset/Sunrise"},{"location":"cave/d2d-tools/#text-window","text":"Selecting this option brings up a Text Display window that behaves in the same way as a window on the Text Workstation , except that the scripts menu is disabled.","title":"Text Window"},{"location":"cave/d2d-tools/#time-of-arrival-lead-time","text":"Selecting the Time Of Arrival / Lead Time option displays a tracking line from a feature's initial starting point in a past frame to its final position in the current frame. Once the final position is set, an Arrival Point is displayed. You can drag this point anywhere along the line to get the Time Of Arrival / Lead Time and Distance. You can also change the Mode from Point to Circular Front or Polyline anywhere along the line to better represent the feature(s).","title":"Time of Arrival / Lead Time"},{"location":"cave/d2d-tools/#units-calculator","text":"This tool converts the units of the first column into differing units of the second column. The units are grouped into temperature, speed, distance, time, and atmospheric pressure. First, simply type the number and select the units of the value you wish to convert in the firstcolumn entry box. Then in the second column, select the desired units to which you want the original value converted. The new value will appear in the second column entry box.","title":"Units Calculator"},{"location":"cave/d2d-tools/#text-workstation","text":"By selecting one of the \"Text\" buttons, a text window opens up. In National Weather Service operations, the text workstation is used to edit new warning text as well as look up past warnings, METARs, and TAFs. 
This functionality is disabled in the Unidata AWIPS version.","title":"Text Workstation"},{"location":"cave/d2d-uair/","text":"The Upper Air dropdown menu provides access to upper air plots, profiler data, radar plan-view and perspective displays of winds, and aircraft and rawinsonde data. Nearby Radiosonde Observations (RAOB) are also included on the menu to provide easy viewing of upper air data. NSHARP Upper Air Soundings \uf0c1 RAOB data is plotted on the standard Skew-T log-p thermodynamic diagram. A small reference map indicating the location(s) of the plotted sounding(s) is provided in the upper left corner of the main display pane. If you overlay another Skew-T whose location is far from the original sounding location, the reference map updates to show both locations. NUCAPS Soundings \uf0c1 The NOAA Unique CrIS/ATMS Processing System ( NUCAPS ) soundings are derived from processing of CrIS/ATMS data, which provides cloud cleared radiances and trace gases that enable increased accuracy in the development of the vertical profile of temperature and water vapor retrievals. By clicking on the individual dots, the forecaster is able to render the sounding for the selected point using the NSHARP plugin . Upper Air Plots \uf0c1 NCEP: 200mb to 850mb RAOB: 150mb to 925mb UKMO 500mb Height \uf0c1 500mb height graphic out to 144 forecast hours. CPC Charts \uf0c1 6-10 day mean 500mb Height 8-14 day mean 500mb Height 6-10 day 500mb Height Anomaly 8-14 day 500mb Height Anomaly NPN Profiler Time-Height \uf0c1 NOAA Profiler Network ( NPN ) observations as a time-series plot. This time-series plugin is also used in the Volume Browser plugin for both grids and observations. NPN Profiler Plot \uf0c1 200hPa-925hPa 1500m-500m AGL Surface Radar VWP Height-Level \uf0c1 15km AGL 14km AGL 13km AGL ... 500m AGL 250m AGL 100m AGL Radar VWP Pressure-Level \uf0c1 200hPa to 925hPa PIREP Aircraft Plot \uf0c1 The Aircraft data includes Low-, Mid-, and High-level Pilot Weather Report (PIREP) observations. The display plots the temperature, aircraft identifier, wind speed and direction, significant weather, and the flight level (in feet). Pilot reports are critical for air safety. Pilot reports on the conditions they are experiencing show up in a matter of minutes on AWIPS. Weather conditions can change quickly, and there is nothing like having a pilot report to provide a bird's eye view of what it is really like up there. PIREPs may validate forecast conditions, or they may describe real-time weather that varies from them. Icing: Low Level, Mid Level, High Level Turbulence: Low Level, Mid Level, High Level Aircraft MDCRS \uf0c1 Meteorological Data Collection and Reporting System (MDCRS) data includes plan-view plots for various 5kft layers and ascent/descent soundings. Using the availability plots (Upper Air menu under MDCRS plots) and ACARS Airports from the Maps menu button you can locate airports that have available soundings. ACARS Airports provides an illustration of locations of airports, but it is not necessary to use it. The \"+\" sign means a temperature sounding and the \"*\" means a temperature and dewpoint sounding. To see a sounding at a location, simply press the Points menu button. Several points from letters of the alphabet will appear on the map display. To view a sounding, drag one of the points/letters to a \"+\" or \"*\" location. From the menu bar press Volume and then Browser.
From the Volume Browser select MDCRS for Source, Sounding for Fields and select the letter/point on the desired location for Points. Click on your selection in the Product Selection List and then press the Load button to view the sounding. A zoomable inset map (NW corner) is available to show the location of the sounding. When you zoom in by clicking mouse Button 2 (B2), the flight track of the ascent/descent sounding is shown on the map. In addition, you can sample the flight track to see the time and elevation. To zoom out, click mouse Button 1 (B1). This inset map (and also those on var vs. height displays, cross sections, and cell trends) can be suppressed by setting the global density (i.e., from the tool bar) at less than 1. 000-500hft in 50ft increments 1 hour profile availability 6 hour profile availability SIGMET and AIRMET reports: Convective, Icing, Turbulence, Tropical, Volcanic \uf0c1 SIGMET \uf0c1 SIGMET (Significant Meteorological Information) is an alphanumeric message describing specific aviation hazard conditions between the surface and 45,000 feet (FL450). A SIGMET includes information about the location of the hazard using VOR locations. SIGMETs are produced on an as-needed basis at the AWC and are distributed on the SBN. AIRMET \uf0c1 AIRMET (Airmen's Meteorological Information) is an alpha-numeric message describing specific aviation hazard conditions between the surface and 45,000 feet (FL450), but not requiring the issuance of a SIGMET. An AIRMET includes information about the location of the hazard using VOR locations. AIRMETs are produced every 6 hours at the AWC for the CONUS area, and are distributed on the SBN. Visibility Products \uf0c1 IFR, Mountain Obscn \uf0c1 Medium Level, High Level \uf0c1","title":"D2d uair"},{"location":"cave/d2d-uair/#nsharp-upper-air-soundings","text":"RAOB data is plotted on the standard Skew-T log-p thermodynamic diagram. A small reference map indicating the location(s) of the plotted sounding(s) is provided in the upper left corner of the main display pane. If you overlay another Skew-T whose location is far from the original sounding location, the reference map updates to show both locations.","title":"NSHARP Upper Air Soundings"},{"location":"cave/d2d-uair/#nucaps-soundings","text":"The NOAA Unique CrIS/ATMS Processing System ( NUCAPS ) soundings are derived from processing of CrIS/ATMS data, which provides cloud cleared radiances and trace gases that enable increased accuracy in the development of the vertical profile of temperature and water vapor retrievals. By clicking on the individual dots, the forecaster is able to render the sounding for the selected point using the NSHARP plugin .","title":"NUCAPS Soundings"},{"location":"cave/d2d-uair/#upper-air-plots","text":"NCEP: 200mb to 850mb RAOB: 150mb to 925mb","title":"Upper Air Plots"},{"location":"cave/d2d-uair/#ukmo-500mb-height","text":"500mb height graphic out to 144 forecast hours.","title":"UKMO 500mb Height"},{"location":"cave/d2d-uair/#cpc-charts","text":"6-10 day mean 500mb Height 8-14 day mean 500mb Height 6-10 day 500mb Height Anomaly 8-14 day 500mb Height Anomaly","title":"CPC Charts"},{"location":"cave/d2d-uair/#npn-profiler-time-height","text":"NOAA Profiler Network ( NPN ) observations as a time-series plot.
This time-series plugin is also used in the Volume Browser plugin for both grids and observations.","title":"NPN Profiler Time-Height"},{"location":"cave/d2d-uair/#npn-profiler-plot","text":"200hPa-925hPa 1500m-500m AGL Surface","title":"NPN Profiler Plot"},{"location":"cave/d2d-uair/#radar-vwp-height-level","text":"15km AGL 14km AGL 13km AGL ... 500m AGL 250m AGL 100m AGL","title":"Radar VWP Height-Level"},{"location":"cave/d2d-uair/#radar-vwp-pressure-level","text":"200hPa to 925hPa","title":"Radar VWP Pressure-Level"},{"location":"cave/d2d-uair/#pirep-aircraft-plot","text":"The Aircraft data includes Low-, Mid-, and High-level Pilot Weather Report (PIREP) observations. The display plots the temperature, aircraft identifier, wind speed and direction, significant weather, and the flight level (in feet). Pilot reports are critical for air safety. Pilots reports on the conditions they are experiencing show up in a matter of minutes on AWIPS. Weather conditions can change quickly, and there is nothing like having a pilot report to provide a bird's eye view of what it is really like up there. PIREPs may validate forecast conditions, or they may describe real-time weather that varies from them. Icing: Low Level, Mid Level, High Level Tubulence: Low Level Mid Level, High Level","title":"PIREP Aircraft Plot"},{"location":"cave/d2d-uair/#aircraft-mdcrs","text":"Meteorological Data Collection and Reporting System (MDCRS) data includes plan-view plots for various 5kft layers and ascent/descent soundings. Using the availability plots (Upper Air menu under MDCRS plots) and ACARS Airports from the Maps menu button you can locate airports that have available soundings. ACARS Airports provides an illustration of locations of airports, but it is not necessary to use it. The \"+\" sign means a temperature sounding and the \"*\" means a temperature and dewpoint sounding. To see a sounding at a location, simply press the Points menu button. Several points from letters of the alphabet will appear on the map display. To view a sounding, drag one of the points/letters to a \"+\" or \"*\" location. From the menu bar press Volume and then Browser. From the Volume Browser select MDCRS for Source, Sounding for Fields and select the letter/point on the desired location for Points. Click on your selection in the Product Selection List and then press the Load button to view the sounding. A zoomable inset map (NW corner) is available to show the location of the sounding. When you zoom in by clicking mouse Button 2 (B2), the flight track of the ascent/descent sounding is shown on the map. In addition, you can sample the flight track to see the time and elevation. To zoom out, click mouse Button 1 (B1). This inset map (and also those on var vs. height displays, cross sections, and cell trends) can be suppressed by setting the global density (i.e., from the tool bar) at less than 1. 000-500hft in 50ft increments 1 hour profile availability 6 hour profile availability","title":"Aircraft MDCRS"},{"location":"cave/d2d-uair/#sigmet-and-airmet-reports-convective-icing-turbulance-tropical-volcanic","text":"","title":"SIGMET and AIRMET reports: Convective, Icing, Turbulance, Tropical, Volcanic"},{"location":"cave/d2d-uair/#sigmet","text":"SIGMET (Significant Meteorological Information) is an alphanumeric message describing specific aviation hazard conditions between the surface and 45,000 feet (FL450). A SIGMET includes information about the location of the hazard using VOR locations. 
SIGMETs are produced on an as-needed basis at the AWC and are distributed on the SBN.","title":"SIGMET"},{"location":"cave/d2d-uair/#airmet","text":"AIRMET (Airmen's Meteorological Information) is an alpha-numeric message describing specific aviation hazard conditions between the surface and 45,000 feet (FL450), but not requiring the issuance of a SIGMET. An AIRMET includes information about the location of the hazard using VOR locations. AIRMETs are produced every 6 hours at the AWC for the CONUS area, and are distributed on the SBN.","title":"AIRMET"},{"location":"cave/d2d-uair/#visibility-products","text":"","title":"Visibility Products"},{"location":"cave/d2d-uair/#ifr-mountain-obscn","text":"","title":"IFR, Mountain Obscn"},{"location":"cave/d2d-uair/#medium-level-high-level","text":"","title":"Medium Level, High Level"},{"location":"cave/goes-16-17-satellite/","text":"GOES 16/17 \uf0c1 The goesr EDEX decoder supports the ingest of GOES products coming over NOAAPort and Unidata's IDD. These include single channel imagery , derived products (Level 2b netCDF files), gridded Geostationary Lightning Mapper (GLM) products (produced by Eric Bruning at Texas Tech), CIRA created RGB specific products, and vertical temperature/moisture profiles . Using derived parameters, additional RGB and channel difference products can be loaded. The dmw EDEX decoder supports the ingest of GOES derived motion winds . GOES East and West products are accessible in the Satellite menu. The menu is broken into sections starting with common CONUS GOES East/West Combo products. There are submenus for each of the separate geospatial sectors: East Full Disk East CONUS East Mesoscale Sectors (x2) West Full Disk West CONUS West Mesoscale Sectors (x2) Hawaii Alaska Puerto Rico Each sector submenu has products for individual channels and vertical profiles, as well as submenus for derived products, channel differences, RGB Composites, GLM data, and derived motion winds. GLM data can also be found with its own submenu option a little lower down the menu and under the Surface menu. The RGB products are not available on MacOS or in a Virtual Machine running CAVE. LDM Pattern Actions \uf0c1 The Unidata IDD redistributes both the NOAAPort/SBN GOES tiled products as well as stitched together GOES products. While AWIPS can decode and ingest both, it's important to only be requesting from one or the other so you aren't creating duplicate processing. The entries that should be used for GOES data are shown below which is found in the LDM's pqact.conf file, located in /awips2/ldm/etc . (For the full list of pqact entries, you can view this file). # GOES 16/17 Single Channel (ABI) via Unidata IDD NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/CloudAndMoistureImagery/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CMI-IDD/\\5\\6\\7\\8.nc4 # GOES 16/17 derived products + derived motion wind via SBN HDS ^(IXT.[8-9]9) (KNES) (..)(..)(..) FILE -close -edex /awips2/data_store/GOES/(\\3:yyyy)(\\3:mm)\\3/\\4/derivedProducts-SBN/\\1_KNES_\\2\\3\\4\\5-(seq) NOTHER ^(IXT[WXY]01) (KNES) (..)(..)(..) 
FILE -close -edex /awips2/data_store/GOES/(\\3:yyyy)(\\3:mm)\\3/\\4/derivedProducts-SBN/\\1_KNES_\\2\\3\\4\\5-(seq) # GOES 16 GLM Gridded Products via Texas Tech-->Unidata IDD NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/GeostationaryLightningMapper/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\3/\\6/GLM-IDD/\\4\\5\\6\\7.nc4 # GOES CIRA derived products NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/GeoColor/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CIRA/GeoColor/\\5\\6\\7\\8.nc4 NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/DebraDust/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CIRA/DebraDust/\\5\\6\\7\\8.nc4 NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/CloudSnow/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CIRA/CloudSnow/\\5\\6\\7\\8.nc4 Individual Channels \uf0c1 All geospatial sectors have 16 individual channel products that can be viewed. Below are samples of Channel 14 (11.20\u03bcm) for each of the sectors. East CONUS 1km \uf0c1 East Full Disk 6km \uf0c1 East Mesoscale Sectors (EMESO-1, EMESO-2) \uf0c1 Two floating mesoscale sectors (location will vary day to day from image shown) West CONUS 1km \uf0c1 West Full Disk \uf0c1 West Mesoscale Sectors (WMESO-1, WMESO-2) \uf0c1 Two floating mesoscale sectors (location will vary day to day from image shown) Alaska \uf0c1 Hawaii \uf0c1 Puerto Rico (PRREGI) \uf0c1 RGB Composites \uf0c1 RGB Composites are made by combining 3 channels and are available for each sector. Quite a few new RGB products have been added in Unidata's 18.2.1 release. These products are generated on the fly in AWIPS using the existing channel products from EDEX. GOES RGB Imagery is NOT SUPPORTED on macOS or within a Virtual Machine OpenGL Shading Language limitations prevent multi-channel imagery from displaying correctly on Mac or in a Virtual Machine. Please use the Linux or Windows installs to view RGB products. Day Cloud Phase \uf0c1 Fire Temperature \uf0c1 Day Land Cloud \uf0c1 Day Cloud Convection \uf0c1 Day Land Cloud Fires \uf0c1 VIS/IR Sandwich \uf0c1 Simple Water Vapor \uf0c1 Air Mass \uf0c1 Ash \uf0c1 Day Convection \uf0c1 Day Snow Fog \uf0c1 Differential Water Vapor \uf0c1 Dust \uf0c1 CIMSS Natural Color \uf0c1 Nighttime Microphysics \uf0c1 SO2 \uf0c1 CIRA Geocolor \uf0c1 CIRA Debra Dust \uf0c1 CIRA Cloud Snow \uf0c1 Daytime Composite 1 \uf0c1 Daytime Composite 5 \uf0c1 Channel Differences \uf0c1 Channel differences are the result of subtracting one channel from another to produce a new product. These products are generated on the fly in AWIPS using the existing channel products from EDEX. There currently 10 channel differences that are offered in CAVE: Split Window (10.3 - 12.3 \u03bcm) Split Cloud Top Phase (11.2 - 8.4 \u03bcm) Night Fog (10.3 - 2.9 \u03bcm) Day Fog (3.9 - 10.3 \u03bcm) Split Fire (2.2 - 1.6 \u03bcm) Split Ozone (9.6 - 10.3 \u03bcm) Split Water Vapor (6.19 - 7.3 \u03bcm) Split Snow (1.6 - 0.64 \u03bcm) Vegetation (0.64 - 0.87 \u03bcm) Upper Level Info (11.2 - 6.19 \u03bcm) The rendering of these products uses the Jep package in Python, which has specific install instructions for Windows. Derived Products \uf0c1 Derived products are also known as Level 2+ products. 
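To make the channel-difference idea above concrete: a channel difference is simply a pixel-wise subtraction of two co-registered bands. The short Python sketch below illustrates this; the array names and sample values are hypothetical stand-ins for two brightness-temperature grids already decoded by EDEX.

import numpy as np

# Hypothetical co-registered ABI brightness-temperature grids (Kelvin);
# in AWIPS these values would come from the existing channel products in EDEX.
bt_10p3um = np.array([[265.1, 270.4], [280.2, 291.7]])  # band 13 (10.3 um)
bt_12p3um = np.array([[263.0, 268.9], [278.5, 289.1]])  # band 15 (12.3 um)

# Split Window channel difference (10.3 um - 12.3 um), computed pixel-wise.
split_window = bt_10p3um - bt_12p3um
print(split_window)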
Currently there are only derived products from GOES East available in AWIPS. Each sector has a different set of products available. To find out some more information on some of the products please the Quick Guides compiled by CIRA. These may not all be available for each sector. The current products offered in CAVE are listed below and to the right is which GOES East sector they are available for (F=Full Disk, C=CONUS, M=Mesoscale): Aerosol Detection - F,C,M Aerosol Optical Depth - F,C Clear Sky Mask - F,C,M Cloud Optical Depth - F,C Cloud Particle Size -F,C,M Cloud Top Height -F,C,M Cloud Top Phase -F,C,M Cloud Top Pressure -F,C Cloud Top Temperature - F,M Derived CAPE - F,C,M Derived K-Index - F,C,M Derived Lifted Index - F,C,M Derived Showalter Index - F,C,M Derived Total Totals - F,C,M Fire Area - F,C Fire Power - F,C Fire Temperature - F,C Instrument Flight Rule (IFR) Probability - C Low IFR Probability - C Marginal Visual Flight Rules (MVFR) Probability - C Cloud Thickness - C Land Skin Temperature - F,C,M RR/QPE - F Sea Surface Temperature - F Total Precip Water - F,C,M Geostationary Lightning Mapper (GLM) \uf0c1 Dr. Eric Bruning at Texas Tech has taken the raw GLM data and coded up some new gridded products that can be ingested and displayed in AWIPS. Minimum Flash Area Average Flash Area Flash Extent Density Group Extent Density Total Optical Energy GLM data are located in the menu structure: Satellite > [SECTOR] > GLM Products . You can also access the data from Surface > GLM - Geostationary Lightning Mapper submenus. Derived Motion Winds \uf0c1 Derived Motion Wind Vectors are produced using sequential ABI images and can provide information about winds at different levels. The wind vectors are computed using both visible and infrared imagery. Winds can be plotted by different pressure layers or individual channels. More information can be found here . Below is an image of the winds at different pressure layers. Vertical Temperature and Moisture Profile \uf0c1 Vertical Temperature and Moisture profiles are available in AWIPS. Similar to NUCAPS, when loaded in CAVE, a circle is displayed for each location that has a vertical profile available. When clicking on the circle, NSHARP will open with the vertical temperature and moisture profile. These profiles are GFS data that have been adjusted based on the satellite observations. More information can be found here . HDF5 Data Store \uf0c1 Decoded GOES satellite data are stored in /awips2/edex/data/hdf5/satellite/ under sector subdirectories: drwxr-xr-x awips fxalpha 4096 AKREGI drwxr-xr-x awips fxalpha 4096 Antarctic drwxr-xr-x awips fxalpha 4096 Arctic drwxr-xr-x awips fxalpha 4096 AREA0600 drwxr-xr-x awips fxalpha 4096 AREA0700 drwxr-xr-x awips fxalpha 4096 AREA3100 drwxr-xr-x awips fxalpha 4096 AREA3101 drwxr-xr-x awips fxalpha 12288 ECONUS drwxr-xr-x awips fxalpha 4096 EFD drwxr-xr-x awips fxalpha 4096 EMESO-1 drwxr-xr-x awips fxalpha 4096 EMESO-2 drwxr-xr-x awips fxalpha 4096 HIREGI drwxr-xr-x awips fxalpha 4096 NEXRCOMP drwxr-xr-x awips fxalpha 4096 PRREGI drwxr-xr-x awips fxalpha 4096 WCONUS drwxr-xr-x awips fxalpha 4096 WFD drwxr-xr-x awips fxalpha 4096 WMESO-1 drwxr-xr-x awips fxalpha 4096 WMESO-2","title":"GOES 16/17"},{"location":"cave/goes-16-17-satellite/#goes-1617","text":"The goesr EDEX decoder supports the ingest of GOES products coming over NOAAPort and Unidata's IDD. 
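As a complement to loading these products through the CAVE menus, imagery that EDEX has decoded and stored can also be requested programmatically. Below is a minimal sketch using the python-awips DataAccessLayer; the EDEX host name, sector name, and parameter name are placeholders and should be replaced with values reported by getAvailableLocationNames and getAvailableParameters on your own server.

from awips.dataaccess import DataAccessLayer

# Point the Data Access Framework at an EDEX server (host name is a placeholder).
DataAccessLayer.changeEDEXHost('edex-cloud.unidata.ucar.edu')

# Build a request against the satellite plugin and inspect what is available.
request = DataAccessLayer.newDataRequest('satellite')
print(DataAccessLayer.getAvailableLocationNames(request))   # sectors, e.g. ECONUS, WFD, ...
request.setLocationNames('ECONUS')                          # placeholder sector
print(DataAccessLayer.getAvailableParameters(request))      # channel/product names
request.setParameters('CH-13-10.35um')                      # placeholder channel

# Fetch the most recent grid and read its raw values and lat/lon coordinates.
times = DataAccessLayer.getAvailableTimes(request)
grid = DataAccessLayer.getGridData(request, [times[-1]])[0]
data = grid.getRawData()
lons, lats = grid.getLatLonCoords()
print(data.shape, float(data.min()), float(data.max()))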
These include single channel imagery , derived products (Level 2b netCDF files), gridded Geostationary Lightning Mapper (GLM) products (produced by Eric Bruning at Texas Tech), CIRA created RGB specific products, and vertical temperature/moisture profiles . Using derived parameters, additional RGB and channel difference products can be loaded. The dmw EDEX decoder supports the ingest of GOES derived motion winds . GOES East and West products are accessible in the Satellite menu. The menu is broken into sections starting with common CONUS GOES East/West Combo products. There are submenus for each of the separate geospatial sectors: East Full Disk East CONUS East Mesoscale Sectors (x2) West Full Disk West CONUS West Mesoscale Sectors (x2) Hawaii Alaska Puerto Rico Each sector submenu has products for individual channels and vertical profiles, as well as submenus for derived products, channel differences, RGB Composites, GLM data, and derived motion winds. GLM data can also be found with its own submenu option a little lower down the menu and under the Surface menu. The RGB products are not available on MacOS or in a Virtual Machine running CAVE.","title":"GOES 16/17"},{"location":"cave/goes-16-17-satellite/#ldm-pattern-actions","text":"The Unidata IDD redistributes both the NOAAPort/SBN GOES tiled products as well as stitched together GOES products. While AWIPS can decode and ingest both, it's important to only be requesting from one or the other so you aren't creating duplicate processing. The entries that should be used for GOES data are shown below which is found in the LDM's pqact.conf file, located in /awips2/ldm/etc . (For the full list of pqact entries, you can view this file). # GOES 16/17 Single Channel (ABI) via Unidata IDD NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/CloudAndMoistureImagery/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CMI-IDD/\\5\\6\\7\\8.nc4 # GOES 16/17 derived products + derived motion wind via SBN HDS ^(IXT.[8-9]9) (KNES) (..)(..)(..) FILE -close -edex /awips2/data_store/GOES/(\\3:yyyy)(\\3:mm)\\3/\\4/derivedProducts-SBN/\\1_KNES_\\2\\3\\4\\5-(seq) NOTHER ^(IXT[WXY]01) (KNES) (..)(..)(..) FILE -close -edex /awips2/data_store/GOES/(\\3:yyyy)(\\3:mm)\\3/\\4/derivedProducts-SBN/\\1_KNES_\\2\\3\\4\\5-(seq) # GOES 16 GLM Gridded Products via Texas Tech-->Unidata IDD NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/GeostationaryLightningMapper/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\3/\\6/GLM-IDD/\\4\\5\\6\\7.nc4 # GOES CIRA derived products NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/GeoColor/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CIRA/GeoColor/\\5\\6\\7\\8.nc4 NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/DebraDust/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CIRA/DebraDust/\\5\\6\\7\\8.nc4 NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/CloudSnow/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CIRA/CloudSnow/\\5\\6\\7\\8.nc4","title":"LDM Pattern Actions"},{"location":"cave/goes-16-17-satellite/#individual-channels","text":"All geospatial sectors have 16 individual channel products that can be viewed. 
Below are samples of Channel 14 (11.20\u03bcm) for each of the sectors.","title":"Individual Channels"},{"location":"cave/goes-16-17-satellite/#east-conus-1km","text":"","title":"East CONUS 1km"},{"location":"cave/goes-16-17-satellite/#east-full-disk-6km","text":"","title":"East Full Disk 6km"},{"location":"cave/goes-16-17-satellite/#east-mesoscale-sectors-emeso-1-emeso-2","text":"Two floating mesoscale sectors (location will vary day to day from image shown)","title":"East Mesoscale Sectors (EMESO-1, EMESO-2)"},{"location":"cave/goes-16-17-satellite/#west-conus-1km","text":"","title":"West CONUS 1km"},{"location":"cave/goes-16-17-satellite/#west-full-disk","text":"","title":"West Full Disk"},{"location":"cave/goes-16-17-satellite/#west-mesoscale-sectors-wmeso-1-wmeso-2","text":"Two floating mesoscale sectors (location will vary day to day from image shown)","title":"West Mesoscale Sectors (WMESO-1, WMESO-2)"},{"location":"cave/goes-16-17-satellite/#alaska","text":"","title":"Alaska"},{"location":"cave/goes-16-17-satellite/#hawaii","text":"","title":"Hawaii"},{"location":"cave/goes-16-17-satellite/#puerto-rico-prregi","text":"","title":"Puerto Rico (PRREGI)"},{"location":"cave/goes-16-17-satellite/#rgb-composites","text":"RGB Composites are made by combining 3 channels and are available for each sector. Quite a few new RGB products have been added in Unidata's 18.2.1 release. These products are generated on the fly in AWIPS using the existing channel products from EDEX. GOES RGB Imagery is NOT SUPPORTED on macOS or within a Virtual Machine OpenGL Shading Language limitations prevent multi-channel imagery from displaying correctly on Mac or in a Virtual Machine. Please use the Linux or Windows installs to view RGB products.","title":"RGB Composites"},{"location":"cave/goes-16-17-satellite/#day-cloud-phase","text":"","title":"Day Cloud Phase"},{"location":"cave/goes-16-17-satellite/#fire-temperature","text":"","title":"Fire Temperature"},{"location":"cave/goes-16-17-satellite/#day-land-cloud","text":"","title":"Day Land Cloud"},{"location":"cave/goes-16-17-satellite/#day-cloud-convection","text":"","title":"Day Cloud Convection"},{"location":"cave/goes-16-17-satellite/#day-land-cloud-fires","text":"","title":"Day Land Cloud Fires"},{"location":"cave/goes-16-17-satellite/#visir-sandwich","text":"","title":"VIS/IR Sandwich"},{"location":"cave/goes-16-17-satellite/#simple-water-vapor","text":"","title":"Simple Water Vapor"},{"location":"cave/goes-16-17-satellite/#air-mass","text":"","title":"Air Mass"},{"location":"cave/goes-16-17-satellite/#ash","text":"","title":"Ash"},{"location":"cave/goes-16-17-satellite/#day-convection","text":"","title":"Day Convection"},{"location":"cave/goes-16-17-satellite/#day-snow-fog","text":"","title":"Day Snow Fog"},{"location":"cave/goes-16-17-satellite/#differential-water-vapor","text":"","title":"Differential Water Vapor"},{"location":"cave/goes-16-17-satellite/#dust","text":"","title":"Dust"},{"location":"cave/goes-16-17-satellite/#cimss-natural-color","text":"","title":"CIMSS Natural Color"},{"location":"cave/goes-16-17-satellite/#nighttime-microphysics","text":"","title":"Nighttime Microphysics"},{"location":"cave/goes-16-17-satellite/#so2","text":"","title":"SO2"},{"location":"cave/goes-16-17-satellite/#cira-geocolor","text":"","title":"CIRA Geocolor"},{"location":"cave/goes-16-17-satellite/#cira-debra-dust","text":"","title":"CIRA Debra Dust"},{"location":"cave/goes-16-17-satellite/#cira-cloud-snow","text":"","title":"CIRA Cloud 
Snow"},{"location":"cave/goes-16-17-satellite/#daytime-composite-1","text":"","title":"Daytime Composite 1"},{"location":"cave/goes-16-17-satellite/#daytime-composite-5","text":"","title":"Daytime Composite 5"},{"location":"cave/goes-16-17-satellite/#channel-differences","text":"Channel differences are the result of subtracting one channel from another to produce a new product. These products are generated on the fly in AWIPS using the existing channel products from EDEX. There are currently 10 channel differences offered in CAVE: Split Window (10.3 - 12.3 \u03bcm) Split Cloud Top Phase (11.2 - 8.4 \u03bcm) Night Fog (10.3 - 2.9 \u03bcm) Day Fog (3.9 - 10.3 \u03bcm) Split Fire (2.2 - 1.6 \u03bcm) Split Ozone (9.6 - 10.3 \u03bcm) Split Water Vapor (6.19 - 7.3 \u03bcm) Split Snow (1.6 - 0.64 \u03bcm) Vegetation (0.64 - 0.87 \u03bcm) Upper Level Info (11.2 - 6.19 \u03bcm) The rendering of these products uses the Jep package in Python, which has specific install instructions for Windows.","title":"Channel Differences"},{"location":"cave/goes-16-17-satellite/#derived-products","text":"Derived products are also known as Level 2+ products. Currently there are only derived products from GOES East available in AWIPS. Each sector has a different set of products available. To find out more information on some of the products, please see the Quick Guides compiled by CIRA. These may not all be available for each sector. The current products offered in CAVE are listed below, and to the right are the GOES East sectors they are available for (F=Full Disk, C=CONUS, M=Mesoscale): Aerosol Detection - F,C,M Aerosol Optical Depth - F,C Clear Sky Mask - F,C,M Cloud Optical Depth - F,C Cloud Particle Size - F,C,M Cloud Top Height - F,C,M Cloud Top Phase - F,C,M Cloud Top Pressure - F,C Cloud Top Temperature - F,M Derived CAPE - F,C,M Derived K-Index - F,C,M Derived Lifted Index - F,C,M Derived Showalter Index - F,C,M Derived Total Totals - F,C,M Fire Area - F,C Fire Power - F,C Fire Temperature - F,C Instrument Flight Rule (IFR) Probability - C Low IFR Probability - C Marginal Visual Flight Rules (MVFR) Probability - C Cloud Thickness - C Land Skin Temperature - F,C,M RR/QPE - F Sea Surface Temperature - F Total Precip Water - F,C,M","title":"Derived Products"},{"location":"cave/goes-16-17-satellite/#geostationary-lightning-mapper-glm","text":"Dr. Eric Bruning at Texas Tech has taken the raw GLM data and coded up some new gridded products that can be ingested and displayed in AWIPS. Minimum Flash Area Average Flash Area Flash Extent Density Group Extent Density Total Optical Energy GLM data are located in the menu structure: Satellite > [SECTOR] > GLM Products . You can also access the data from Surface > GLM - Geostationary Lightning Mapper submenus.","title":"Geostationary Lightning Mapper (GLM)"},{"location":"cave/goes-16-17-satellite/#derived-motion-winds","text":"Derived Motion Wind Vectors are produced using sequential ABI images and can provide information about winds at different levels. The wind vectors are computed using both visible and infrared imagery. Winds can be plotted by different pressure layers or individual channels. More information can be found here . Below is an image of the winds at different pressure layers.","title":"Derived Motion Winds"},{"location":"cave/goes-16-17-satellite/#vertical-temperature-and-moisture-profile","text":"Vertical Temperature and Moisture profiles are available in AWIPS. 
Similar to NUCAPS, when loaded in CAVE, a circle is displayed for each location that has a vertical profile available. When clicking on the circle, NSHARP will open with the vertical temperature and moisture profile. These profiles are GFS data that have been adjusted based on the satellite observations. More information can be found here .","title":"Vertical Temperature and Moisture Profile"},{"location":"cave/goes-16-17-satellite/#hdf5-data-store","text":"Decoded GOES satellite data are stored in /awips2/edex/data/hdf5/satellite/ under sector subdirectories: drwxr-xr-x awips fxalpha 4096 AKREGI drwxr-xr-x awips fxalpha 4096 Antarctic drwxr-xr-x awips fxalpha 4096 Arctic drwxr-xr-x awips fxalpha 4096 AREA0600 drwxr-xr-x awips fxalpha 4096 AREA0700 drwxr-xr-x awips fxalpha 4096 AREA3100 drwxr-xr-x awips fxalpha 4096 AREA3101 drwxr-xr-x awips fxalpha 12288 ECONUS drwxr-xr-x awips fxalpha 4096 EFD drwxr-xr-x awips fxalpha 4096 EMESO-1 drwxr-xr-x awips fxalpha 4096 EMESO-2 drwxr-xr-x awips fxalpha 4096 HIREGI drwxr-xr-x awips fxalpha 4096 NEXRCOMP drwxr-xr-x awips fxalpha 4096 PRREGI drwxr-xr-x awips fxalpha 4096 WCONUS drwxr-xr-x awips fxalpha 4096 WFD drwxr-xr-x awips fxalpha 4096 WMESO-1 drwxr-xr-x awips fxalpha 4096 WMESO-2","title":"HDF5 Data Store"},{"location":"cave/hazard-services-alert/","text":"Alerts \uf0c1","title":"Hazard services alert"},{"location":"cave/hazard-services-alert/#alerts","text":"","title":"Alerts"},{"location":"cave/hazard-services-create/","text":"Hazard Creation Methods \uf0c1 Recommender Execution \uf0c1 Recommender Output \uf0c1 River Flood Recommender \uf0c1 Flash Flood Recommender \uf0c1 Storm Track Recommender \uf0c1 Dam/Levee Break Flood Recommender \uf0c1 Burn Scar Recommender \uf0c1 Creating a Hazard from a River Gauge \uf0c1 Selection Tools \uf0c1 Select By Area \uf0c1 Freehand Drawing \uf0c1 Manipulating Hazards \uf0c1 Adjusting a Hazard Polygon \uf0c1 Moving a Polygon Vertex \uf0c1 Deleting a Polygon Vertex \uf0c1 Adding a Polygon Vertex \uf0c1 Moving a Hazard Geometry \uf0c1 Hazard Information Dialog \uf0c1 Hazard Type \uf0c1 Time Range \uf0c1 Details (Metadata) \uf0c1 Hazard Status \uf0c1 #Propose \uf0c1 Preview \uf0c1 Product Staging Dialog \uf0c1 Product Editor \uf0c1 Issue \uf0c1 Ending and Ended \uf0c1","title":"Hazard services create"},{"location":"cave/hazard-services-create/#hazard-creation-methods","text":"","title":"Hazard Creation Methods"},{"location":"cave/hazard-services-create/#recommender-execution","text":"","title":"Recommender Execution"},{"location":"cave/hazard-services-create/#recommender-output","text":"","title":"Recommender Output"},{"location":"cave/hazard-services-create/#river-flood-recommender","text":"","title":"River Flood Recommender"},{"location":"cave/hazard-services-create/#flash-flood-recommender","text":"","title":"Flash Flood Recommender"},{"location":"cave/hazard-services-create/#storm-track-recommender","text":"","title":"Storm Track Recommender"},{"location":"cave/hazard-services-create/#damlevee-break-flood-recommender","text":"","title":"Dam/Levee Break Flood Recommender"},{"location":"cave/hazard-services-create/#burn-scar-recommender","text":"","title":"Burn Scar Recommender"},{"location":"cave/hazard-services-create/#creating-a-hazard-from-a-river-gauge","text":"","title":"Creating a Hazard from a River Gauge"},{"location":"cave/hazard-services-create/#selection-tools","text":"","title":"Selection Tools"},{"location":"cave/hazard-services-create/#select-by-area","text":"","title":"Select By 
Area"},{"location":"cave/hazard-services-create/#freehand-drawing","text":"","title":"Freehand Drawing"},{"location":"cave/hazard-services-create/#manipulating-hazards","text":"","title":"Manipulating Hazards"},{"location":"cave/hazard-services-create/#adjusting-a-hazard-polygon","text":"","title":"Adjusting a Hazard Polygon"},{"location":"cave/hazard-services-create/#moving-a-polygon-vertex","text":"","title":"Moving a Polygon Vertex"},{"location":"cave/hazard-services-create/#deleting-a-polygon-vertex","text":"","title":"Deleting a Polygon Vertex"},{"location":"cave/hazard-services-create/#adding-a-polygon-vertex","text":"","title":"Adding a Polygon Vertex"},{"location":"cave/hazard-services-create/#moving-a-hazard-geometry","text":"","title":"Moving a Hazard Geometry"},{"location":"cave/hazard-services-create/#hazard-information-dialog","text":"","title":"Hazard Information Dialog"},{"location":"cave/hazard-services-create/#hazard-type","text":"","title":"Hazard Type"},{"location":"cave/hazard-services-create/#time-range","text":"","title":"Time Range"},{"location":"cave/hazard-services-create/#details-metadata","text":"","title":"Details (Metadata)"},{"location":"cave/hazard-services-create/#hazard-status","text":"","title":"Hazard Status"},{"location":"cave/hazard-services-create/#propose","text":"","title":"#Propose"},{"location":"cave/hazard-services-create/#preview","text":"","title":"Preview"},{"location":"cave/hazard-services-create/#product-staging-dialog","text":"","title":"Product Staging Dialog"},{"location":"cave/hazard-services-create/#product-editor","text":"","title":"Product Editor"},{"location":"cave/hazard-services-create/#issue","text":"","title":"Issue"},{"location":"cave/hazard-services-create/#ending-and-ended","text":"","title":"Ending and Ended"},{"location":"cave/hazard-services-display/","text":"AWIPS Hazard Service Display \uf0c1 Hazard Services is a collection of AWIPS applications used by forecasters to create, update, and manage hazards, replacing and unifying hazard generation capabilities. WarnGen RiverPro GHG etc. In addition to providing a seamless forecast process for generating short-fused, long-fused, and hydrologic hazards, Hazard Services allows the forecaster to focus on the meteorology of the hazard situation, letting the system take on more of the responsibility for the generation and dissemination of products. Launching Hazard Services \uf0c1 Hazard Services can be launched from the various CAVE perspectives by selection the toolbar item \"Hazards\". When Hazard Services is first started, the Console and the Spatial Display are visible. Spatial Display and Console \uf0c1 The Spatial Display is the Hazard Services drawing layer which is loaded into the CAVE Map Editor when Hazard Services is started. It is the Hazard Services map, displaying hazard areas relative to geopolitical boundaries and handling hazard drawing and editing. Its presence is indicated by the 'Hazard Services (Editable)' line in the CAVE Map Legend, and it supports operations common to other AWIPS drawing layers. The Console is the main control panel for Hazard Services. It is always displayed if Hazard Services is running. Closing it closes Hazard Services as well. The Console is a CAVE View, by default docked within the main window. The Console includes a toolbar and a drop-down (\"view\") menu to the right of or just under its title tab. Below these is the table of hazard events. 
Hazard Services Toolbar \uf0c1 Hydro \uf0c1 The leftmost icon on the tool bar is an indicator if Hydro hazards are being worked or not (it will turn yellow if any active hazards are hidden from view by a filter). Setup (Settings) \uf0c1 Allows you to filter displayed hazard information to focus on the meteorological situation of concern. For example, you may want to focus only on hydrological hazards in a particular time scale and over a particular area. The Settings drop-down menu allows you to select an existing Setting or a recently-used Setting, create a new Setting, edit the current Setting, or delete the current (User) Setting. As new Settings are created, they are added to this drop-down list. The Console\u2019s title tab shows the name of the currently loaded Setting. Settings can also be viewed and edited in the Localization Perspective . Filters \uf0c1 Allows quick modification of the filters being used by the current Setting. Events may be filtered by Hazard Type, Site ID, and/or Status. As the filters are altered, the Hazard Event Table contents change to include only those hazards that pass the filters. For example, with a number of potential events possible, you can select a couple of interest, move them to pending state, and propose one. To reduce clutter in the Console you can hide potentials using the Filters menu, so that all potential events are still present but hidden in both the Console and the Spatial Display. Recommenders (Tools) \uf0c1 The Tools button reveals a drop-down menu listing all the recommenders and other tools available in the current Setting. Recommenders may be run from this menu. When you select a Setting, this menu is populated with appropriate content. Products \uf0c1 Generate RVS\u200b With an FL.x hazard selected in the Console, select this item to bring up a dialog to write an RVS text product. Correct Product\u200b Selecting the Correct Product option provides a list of products that may be corrected. The dialog includes seven columns: Product Category, Issue Time, Event IDs, Hazard Type, VTEC, Expiration Time, and User Name. You can click in a column header to order by, or type in the Search box at the bottom. Upon selecting an item from the list, the Hazard Information Dialog launches. View Product\u200b This option allows you to review issued products, selecting from a list in a Select Product to View dialog. Use the dialog to select the product type (using click, Ctrl-click, Shift-click), then click and select View Product or double-click to see the legacy text. A similar dialog will be produced by selecting the View Products for Selected Events item from the Console pop-up. In this case, the Filter/Query section is not needed, so you\u2019ll see just the lower portion of the illustrated dialog. Spatial Display Modes \uf0c1 When Hazard Services is in Editable state, three buttons set the mode of the Spatial Display, governing how it interprets mouse clicks. Drawing Tools \uf0c1 This menu has six choices: Draw Polygon\u200b When set, mouse clicks on the Spatial Display draw polygons, one click per node (MB1 click to place a node, MB3 click to complete the polygon). AddTo Polygon\u200b If a polygon is active (hazard selected), this choice allows you to augment the area or create a new separate area that will be logically joined with the current polygon. Example of the latter: Note how the single hazard now comprises two polygons. (When you select Preview, these will be joined into a single polygon for issuance.) 
Draw Freehand Polygon\u200b When set, mouse clicks on the Spatial Display draw freehand polygons (MB1 press, drag, and release to draw the polygon's outline). Note that issued text products will conform to current rules limiting polygon vertices to 20 or snapping areas to counties or zones. The freehand, many-vertex, shapes will be modified at some point during the hazard-issuance workflow. AddTo Freehand Polygon\u200b Similar to AddTo Polygon, but drawing is freehand. Note that you can augment both \u201csegments\u201d and freehand polygons with either of the AddTo tools. Remove Polygon Vertices\u200b In the case where you have a polygon with many vertices, it is very difficult to modify a boundary. This tool will remove a section of vertices to make the problem more tractable. With the tool selected, drag with MB1 to enclose a segment of the polygon. When you release, those vertices will be removed. Remove Polygon Area\u200b This tool provides a way to remove sections of a geometry. Press MB1 and drag out an area that intersects your geometry. Upon release, the intersection area will be removed with the new boundary along the curve you drew. If more than one hazard is selected in the Console, only Draw Polygon and Draw Freehand Polygon are available. The others are invalid and dimmed. Select Event \uf0c1 This radio button sets the mode to event selection. When set, mouse clicks on the Spatial Display select hazard events, and drags cause panning. This is the default mode choice of this set of radio buttons. Pan \uf0c1 This radio button sets the mode to pan mode. When clicked, you can pan the map without inadvertently moving or selecting polygons. Maps for Select by Area \uf0c1 The Maps for Select by Area button reveals a drop-down menu allowing the selection of maps that may be used for selecting by area within the Spatial Display. If the button is disabled, no maps that allow select-by-area are currently loaded. If the button is enabled, but a map menu item within the drop-down menu is disabled, that map is loaded but is currently invisible. Temporal Controls \uf0c1 There are two buttons used to control the Timeline view at the right side of the Hazard Table. You can also zoom and pan the Timeline using the mouse. Selected Time Mode This options menu allows you to select the time mode, either a single time or range of times. Show Current Time \u200bThis button moves the Timeline so that the current time is visible toward its left end. View Menu \uf0c1 The View menu is a drop-down menu holding menu items for functions that in general are less frequently used than those available via the toolbar.","title":"AWIPS Hazard Service Display"},{"location":"cave/hazard-services-display/#awips-hazard-service-display","text":"Hazard Services is a collection of AWIPS applications used by forecasters to create, update, and manage hazards, replacing and unifying hazard generation capabilities. WarnGen RiverPro GHG etc. In addition to providing a seamless forecast process for generating short-fused, long-fused, and hydrologic hazards, Hazard Services allows the forecaster to focus on the meteorology of the hazard situation, letting the system take on more of the responsibility for the generation and dissemination of products.","title":"AWIPS Hazard Service Display"},{"location":"cave/hazard-services-display/#launching-hazard-services","text":"Hazard Services can be launched from the various CAVE perspectives by selection the toolbar item \"Hazards\". 
When Hazard Services is first started, the Console and the Spatial Display are visible.","title":"Launching Hazard Services"},{"location":"cave/hazard-services-display/#spatial-display-and-console","text":"The Spatial Display is the Hazard Services drawing layer which is loaded into the CAVE Map Editor when Hazard Services is started. It is the Hazard Services map, displaying hazard areas relative to geopolitical boundaries and handling hazard drawing and editing. Its presence is indicated by the 'Hazard Services (Editable)' line in the CAVE Map Legend, and it supports operations common to other AWIPS drawing layers. The Console is the main control panel for Hazard Services. It is always displayed if Hazard Services is running. Closing it closes Hazard Services as well. The Console is a CAVE View, by default docked within the main window. The Console includes a toolbar and a drop-down (\"view\") menu to the right of or just under its title tab. Below these is the table of hazard events.","title":"Spatial Display and Console"},{"location":"cave/hazard-services-display/#hazard-services-toolbar","text":"","title":"Hazard Services Toolbar"},{"location":"cave/hazard-services-display/#hydro","text":"The leftmost icon on the tool bar is an indicator if Hydro hazards are being worked or not (it will turn yellow if any active hazards are hidden from view by a filter).","title":"Hydro"},{"location":"cave/hazard-services-display/#setup-settings","text":"Allows you to filter displayed hazard information to focus on the meteorological situation of concern. For example, you may want to focus only on hydrological hazards in a particular time scale and over a particular area. The Settings drop-down menu allows you to select an existing Setting or a recently-used Setting, create a new Setting, edit the current Setting, or delete the current (User) Setting. As new Settings are created, they are added to this drop-down list. The Console\u2019s title tab shows the name of the currently loaded Setting. Settings can also be viewed and edited in the Localization Perspective .","title":"Setup (Settings)"},{"location":"cave/hazard-services-display/#filters","text":"Allows quick modification of the filters being used by the current Setting. Events may be filtered by Hazard Type, Site ID, and/or Status. As the filters are altered, the Hazard Event Table contents change to include only those hazards that pass the filters. For example, with a number of potential events possible, you can select a couple of interest, move them to pending state, and propose one. To reduce clutter in the Console you can hide potentials using the Filters menu, so that all potential events are still present but hidden in both the Console and the Spatial Display.","title":"Filters"},{"location":"cave/hazard-services-display/#recommenders-tools","text":"The Tools button reveals a drop-down menu listing all the recommenders and other tools available in the current Setting. Recommenders may be run from this menu. When you select a Setting, this menu is populated with appropriate content.","title":"Recommenders (Tools)"},{"location":"cave/hazard-services-display/#products","text":"Generate RVS\u200b With an FL.x hazard selected in the Console, select this item to bring up a dialog to write an RVS text product. Correct Product\u200b Selecting the Correct Product option provides a list of products that may be corrected. The dialog includes seven columns: Product Category, Issue Time, Event IDs, Hazard Type, VTEC, Expiration Time, and User Name. 
You can click in a column header to order by, or type in the Search box at the bottom. Upon selecting an item from the list, the Hazard Information Dialog launches. View Product\u200b This option allows you to review issued products, selecting from a list in a Select Product to View dialog. Use the dialog to select the product type (using click, Ctrl-click, Shift-click), then click and select View Product or double-click to see the legacy text. A similar dialog will be produced by selecting the View Products for Selected Events item from the Console pop-up. In this case, the Filter/Query section is not needed, so you\u2019ll see just the lower portion of the illustrated dialog.","title":"Products"},{"location":"cave/hazard-services-display/#spatial-display-modes","text":"When Hazard Services is in Editable state, three buttons set the mode of the Spatial Display, governing how it interprets mouse clicks.","title":"Spatial Display Modes"},{"location":"cave/hazard-services-display/#drawing-tools","text":"This menu has six choices: Draw Polygon\u200b When set, mouse clicks on the Spatial Display draw polygons, one click per node (MB1 click to place a node, MB3 click to complete the polygon). AddTo Polygon\u200b If a polygon is active (hazard selected), this choice allows you to augment the area or create a new separate area that will be logically joined with the current polygon. Example of the latter: Note how the single hazard now comprises two polygons. (When you select Preview, these will be joined into a single polygon for issuance.) Draw Freehand Polygon\u200b When set, mouse clicks on the Spatial Display draw freehand polygons (MB1 press, drag, and release to draw the polygon's outline). Note that issued text products will conform to current rules limiting polygon vertices to 20 or snapping areas to counties or zones. The freehand, many-vertex, shapes will be modified at some point during the hazard-issuance workflow. AddTo Freehand Polygon\u200b Similar to AddTo Polygon, but drawing is freehand. Note that you can augment both \u201csegments\u201d and freehand polygons with either of the AddTo tools. Remove Polygon Vertices\u200b In the case where you have a polygon with many vertices, it is very difficult to modify a boundary. This tool will remove a section of vertices to make the problem more tractable. With the tool selected, drag with MB1 to enclose a segment of the polygon. When you release, those vertices will be removed. Remove Polygon Area\u200b This tool provides a way to remove sections of a geometry. Press MB1 and drag out an area that intersects your geometry. Upon release, the intersection area will be removed with the new boundary along the curve you drew. If more than one hazard is selected in the Console, only Draw Polygon and Draw Freehand Polygon are available. The others are invalid and dimmed.","title":"Drawing Tools"},{"location":"cave/hazard-services-display/#select-event","text":"This radio button sets the mode to event selection. When set, mouse clicks on the Spatial Display select hazard events, and drags cause panning. This is the default mode choice of this set of radio buttons.","title":"Select Event"},{"location":"cave/hazard-services-display/#pan","text":"This radio button sets the mode to pan mode. 
When clicked, you can pan the map without inadvertently moving or selecting polygons.","title":"Pan"},{"location":"cave/hazard-services-display/#maps-for-select-by-area","text":"The Maps for Select by Area button reveals a drop-down menu allowing the selection of maps that may be used for selecting by area within the Spatial Display. If the button is disabled, no maps that allow select-by-area are currently loaded. If the button is enabled, but a map menu item within the drop-down menu is disabled, that map is loaded but is currently invisible.","title":"Maps for Select by Area"},{"location":"cave/hazard-services-display/#temporal-controls","text":"There are two buttons used to control the Timeline view at the right side of the Hazard Table. You can also zoom and pan the Timeline using the mouse. Selected Time Mode This options menu allows you to select the time mode, either a single time or range of times. Show Current Time \u200bThis button moves the Timeline so that the current time is visible toward its left end.","title":"Temporal Controls"},{"location":"cave/hazard-services-display/#view-menu","text":"The View menu is a drop-down menu holding menu items for functions that in general are less frequently used than those available via the toolbar.","title":"View Menu"},{"location":"cave/hazard-services-example/","text":"Hazard Life Cycle \uf0c1 Transition from Product Centric toward Information Centric \uf0c1 Examples of Creating, Continuing, and Ending Hazards \uf0c1","title":"Hazard services example"},{"location":"cave/hazard-services-example/#hazard-life-cycle","text":"","title":"Hazard Life Cycle"},{"location":"cave/hazard-services-example/#transition-from-product-centric-toward-information-centric","text":"","title":"Transition from Product Centric toward Information Centric"},{"location":"cave/hazard-services-example/#examples-of-creating-continuing-and-ending-hazards","text":"","title":"Examples of Creating, Continuing, and Ending Hazards"},{"location":"cave/hazard-services-settings/","text":"Hazard Settings \uf0c1 Change Site \uf0c1 Check Hazard Conflicts \uf0c1 Auto Check Hazard Conflicts \uf0c1 Add To Selected \uf0c1 Show Hatched Areas \uf0c1 Change VTEC Mode \uf0c1 Reset Events \uf0c1 Hazard Event Table \uf0c1 Column Headers \uf0c1 Non-Timeline Headers Timeline Header Table Rows \uf0c1 Hazard History \uf0c1 Settings Overview \uf0c1 Settings Menu \uf0c1 Settings Dialog \uf0c1 Hazards Filter Tab \uf0c1 Console Tab \uf0c1 Console Coloring Tab \uf0c1 HID/Spatial Tab \uf0c1 Recommenders Tab \uf0c1 Maps/Overlays Tab \uf0c1","title":"Hazard Settings"},{"location":"cave/hazard-services-settings/#hazard-settings","text":"","title":"Hazard Settings"},{"location":"cave/hazard-services-settings/#change-site","text":"","title":"Change Site"},{"location":"cave/hazard-services-settings/#check-hazard-conflicts","text":"","title":"Check Hazard Conflicts"},{"location":"cave/hazard-services-settings/#auto-check-hazard-conflicts","text":"","title":"Auto Check Hazard Conflicts"},{"location":"cave/hazard-services-settings/#add-to-selected","text":"","title":"Add To Selected"},{"location":"cave/hazard-services-settings/#show-hatched-areas","text":"","title":"Show Hatched Areas"},{"location":"cave/hazard-services-settings/#change-vtec-mode","text":"","title":"Change VTEC Mode"},{"location":"cave/hazard-services-settings/#reset-events","text":"","title":"Reset Events"},{"location":"cave/hazard-services-settings/#hazard-event-table","text":"","title":"Hazard Event 
Table"},{"location":"cave/hazard-services-settings/#column-headers","text":"Non-Timeline Headers Timeline Header","title":"Column Headers"},{"location":"cave/hazard-services-settings/#table-rows","text":"","title":"Table Rows"},{"location":"cave/hazard-services-settings/#hazard-history","text":"","title":"Hazard History"},{"location":"cave/hazard-services-settings/#settings-overview","text":"","title":"Settings Overview"},{"location":"cave/hazard-services-settings/#settings-menu","text":"","title":"Settings Menu"},{"location":"cave/hazard-services-settings/#settings-dialog","text":"","title":"Settings Dialog"},{"location":"cave/hazard-services-settings/#hazards-filter-tab","text":"","title":"Hazards Filter Tab"},{"location":"cave/hazard-services-settings/#console-tab","text":"","title":"Console Tab"},{"location":"cave/hazard-services-settings/#console-coloring-tab","text":"","title":"Console Coloring Tab"},{"location":"cave/hazard-services-settings/#hidspatial-tab","text":"","title":"HID/Spatial Tab"},{"location":"cave/hazard-services-settings/#recommenders-tab","text":"","title":"Recommenders Tab"},{"location":"cave/hazard-services-settings/#mapsoverlays-tab","text":"","title":"Maps/Overlays Tab"},{"location":"cave/import-export/","text":"Import/Export \uf0c1 Export Images/GIFs \uf0c1 The D2D screen can be exported as a PNG image as well as an animated GIF using the File > Export > Image menu option. This captures the current state of the screen, and allows you to set animation options (frame number, dwell time, etc.) for exporting GIFs. If you choose to animate, you will either need to rename the destination file to have the .gif extension, or CAVE will pop up a dialog when you go to save, asking you to confirm that you want to output a GIF. Note : This functionality does not currently work on Mac OS because it uses OGL libraries which are not compatible with Mac. Export KML \uf0c1 The Export submenu also includes a KML option ( File > Export > KML ), which allows users to save D2D displays or GFE grids in the KML (Keyhole Markup Language) file format. When zipped (compressed), the KML file format forms a KMZ file, which can be used in applications such as Google Earth. The KML dialog box includes options to select frames to export. This includes exporting all frames, the current/displayed frame, a range of frames, and, in GFE, the selected time range as highlighted in the Grid Manager. Additional options are available for selection under the \"Other Options\" section: Export Hidden : When selected, all displayed and hidden products listed in the Product Legend section of the Main Display Pane will be exported. Export Maps : When selected, all enabled maps displayed within the Main Display Pane will be exported. Shade Earth : When selected, a shaded background is applied to the exported product. If loaded in Google Earth, the earth will be overlaid with a black backdrop, and data will be displayed as it would in D2D with a black background. Show Background Tiles : When selected, data (such as plot data) will display on top of black tiles when loaded in Google Earth. CAVE Import Formats \uf0c1 CAVE supports the following geo-referenced data files. CAVE can import the following formats through the File > Import menu. Background... Image... BCD File GeoTIFF LPI File SPI File Displays CAVE Export Formats \uf0c1 CAVE can export to the following through the File > Export menu. Image Print Screen KML Editor Display... 
Perspective Displays...","title":"Import/Export"},{"location":"cave/import-export/#importexport","text":"","title":"Import/Export"},{"location":"cave/import-export/#export-imagesgifs","text":"The D2D screen can be exported as a PNG image as well as an animated GIF using the File > Export > Image menu option. This captures the current state of the screen, and allows you to set animation options (frame number, dwell time, etc) for exporting GIFs. If you choose to animate, you will either need to rename the destination file to have the .gif extension, or CAVE will pop up a dialog when you go to save, asking you to confirm that you want to output a GIF. Note : This functionality does not currently work on Mac OS because it implements OGL libraries which are not compatible on Mac.","title":"Export Images/GIFs"},{"location":"cave/import-export/#export-kml","text":"The Export submenu also includes a KML option ( File > Export > KML ), which allows users to save D2D displays or GFE grids in the KML (Keyhole Markup Language) file format. When zipped (compressed), the KML file format forms a KMZ file, which can be used in applications such as Google Earth. The KML dialog box includes options to select frames to export. This includes exporting all frames, the current/displayed frame, a range of frames, and, in GFE, the selected time range as highlighted in the Grid Manager. Additional options are available for selection under the \"Other Options\" section: Export Hidden : When selected, all displayed and hidden products listed in the Product Legend section of the Main Display Pane will be exported. Export Maps : When selected, all enabled maps displayed within the Main Display Pane will be exported. Shade Earth : When selected, a shaded background is applied to the exported product. If loaded in Google Earth, the earth will be overlaid with a black backdrop, and data will be displayed as it would in D2D with a black background. Show Background Tiles : When selected, data (such as plot data) will display on top of black tiles when loaded in Google Earth.","title":"Export KML"},{"location":"cave/import-export/#cave-import-formats","text":"CAVE supported the following geo-referenced data files. CAVE can import the following through formats through the File > Import menu. Background... Image... BCD File GeoTIFF LPI File SPI File Displays","title":"CAVE Import Formats"},{"location":"cave/import-export/#cave-export-formats","text":"CAVE can export to the following through the File > Export menu. Image Print Screen KML Editor Display... Perspective Displays...","title":"CAVE Export Formats"},{"location":"cave/localization-perspective/","text":"Localization perspective \uf0c1 Localization Levels \uf0c1 AWIPS uses a hierarchical system known as Localization to configure many aspects of EDEX and CAVE, such as available menu items, color maps, and derived parameters. This system allows a user to override existing configurations and customize CAVE. For example, a User -level localization file will supercede any similar file in a higher level (such as Site ). There are three levels of localization , starting with the default BASE BASE - default SITE - 3-letter WFO ID (required) overrides base USER - user-level localization overrides site and base Localization Editor \uf0c1 The Localization Perspective acts as file editor for the XML, Python, and text files which customize the look and feel of CAVE. This perspective is available in the menu CAVE > Perspective > Localization . 
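To make the override order concrete, the following is a small illustrative Python sketch (not actual AWIPS code) of the rule that the most specific localization level providing a file wins; the level names mirror the BASE/SITE/USER hierarchy described above, and the file names are hypothetical.

# Most specific level first; SITE overrides BASE, USER overrides both.
LEVELS = ['USER', 'SITE', 'BASE']

def resolve(filename, available):
    # available maps a level name to the set of files present at that level;
    # return the first (most specific) level that provides the file.
    for level in LEVELS:
        if filename in available.get(level, set()):
            return level, level.lower() + '/' + filename
    raise FileNotFoundError(filename)

available = {
    'BASE': {'menus/satellite/index.xml', 'colormaps/IR.cmap'},
    'SITE': {'colormaps/IR.cmap'},
    'USER': {'menus/satellite/index.xml'},
}
print(resolve('menus/satellite/index.xml', available))  # the USER copy wins
print(resolve('colormaps/IR.cmap', available))          # SITE overrides BASE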
Users may copy and add files to available directories at their own User localization version. Examples of things that can be accessed through the perspective include (this list is not all-inclusive): NCP Predefined Areas, Color Maps and Style Rules D2D Volume Browser Controls D2D Bundles - Scales (WFO, State(s), etc.) CAVE Map Overlays, Color Maps and Style Rules GFE Tools and Utilities The left panel contains a directory heirarchy of CAVE files for D2D, GFE, and NCP, which can be copied and edited as user localization files. There may be several versions of each file including BASE , CONFIGURED (GFE only), SITE , and USER . Each file version is listed separately under the actual file name. The File Editor view opens the selected configuration file in an appropriate editor. For example, a Python file is opened in a Python editor, and an XML file is opened in an XML editor. Customizing CAVE Menus \uf0c1 Navigate to D2D > Menus and select a submenu (e.g. satellite ). This directory lists all of the menu file contributions made by this data plugin. Most data menu directories will have an index.xml file from which you can investigate the menu structure and make needed changes. Selecting a file such as index.xml (by double clicking, or expanding) will show a sub-menu with a default localization level (typically BASE or CONFIGURED ). Double-click this file to open in the file editor (you may need to click Source at the bottom of the view to see the raw XML). Right-click this file and select Copy To > User ( username ) and you will see the file localization versions update with the new copy. Select this file to edit, and override, the existing version.","title":"Localization Perspective"},{"location":"cave/localization-perspective/#localization-perspective","text":"","title":"Localization perspective"},{"location":"cave/localization-perspective/#localization-levels","text":"AWIPS uses a hierarchical system known as Localization to configure many aspects of EDEX and CAVE, such as available menu items, color maps, and derived parameters. This system allows a user to override existing configurations and customize CAVE. For example, a User -level localization file will supercede any similar file in a higher level (such as Site ). There are three levels of localization , starting with the default BASE BASE - default SITE - 3-letter WFO ID (required) overrides base USER - user-level localization overrides site and base","title":"Localization Levels"},{"location":"cave/localization-perspective/#localization-editor","text":"The Localization Perspective acts as file editor for the XML, Python, and text files which customize the look and feel of CAVE. This perspective is available in the menu CAVE > Perspective > Localization . Users may copy and add files to available directories at their own User localization version. Examples of things that can be accessed through the perspective include (this list is not all-inclusive): NCP Predefined Areas, Color Maps and Style Rules D2D Volume Browser Controls D2D Bundles - Scales (WFO, State(s), etc.) CAVE Map Overlays, Color Maps and Style Rules GFE Tools and Utilities The left panel contains a directory heirarchy of CAVE files for D2D, GFE, and NCP, which can be copied and edited as user localization files. There may be several versions of each file including BASE , CONFIGURED (GFE only), SITE , and USER . Each file version is listed separately under the actual file name. The File Editor view opens the selected configuration file in an appropriate editor. 
For example, a Python file is opened in a Python editor, and an XML file is opened in an XML editor.","title":"Localization Editor"},{"location":"cave/localization-perspective/#customizing-cave-menus","text":"Navigate to D2D > Menus and select a submenu (e.g. satellite ). This directory lists all of the menu file contributions made by this data plugin. Most data menu directories will have an index.xml file from which you can investigate the menu structure and make needed changes. Selecting a file such as index.xml (by double clicking, or expanding) will show a sub-menu with a default localization level (typically BASE or CONFIGURED ). Double-click this file to open in the file editor (you may need to click Source at the bottom of the view to see the raw XML). Right-click this file and select Copy To > User ( username ) and you will see the file localization versions update with the new copy. Select this file to edit, and override, the existing version.","title":"Customizing CAVE Menus"},{"location":"cave/maps-views-projections/","text":"Maps, Views, Projections \uf0c1 Default Map Scales \uf0c1 The first toolbar menu item is a dropdown menu for different geographic areas and map projections. The default view is always CONUS , which is a North Polar Stereographic projection centered on the Continental United States. Default projections and areas available in the menu CONUS N. Hemisphere (North Polar Stereographic) Regional (for the selected localization site) WFO (for the selected localization site) World - Mercator World - CED World - Mollweide GOES East Full Disk (Geostationary) GOES West Full Disk (Geostationary) Regional Mercator projections for Africa Alaska Antarctica Arctic Australia, New Zealand Europe Hawaii Japan Pacific Ocean Puerto Rico South America WFO (Has a submenu which contains a map scale for every NWS localization site) New Map Editor / View \uf0c1 Adding a New Map Editor \uf0c1 This can be done in two ways: using the file menu and right clicking on the tab bar. Using the file menu, simply go to: File > New Map . This opens a new map editor tab with the default projection (CONUS Polar Stereographic). To use the tab bar, right-click on or next to any tab and select New Editor Renaming Map Editor \uf0c1 Any of the map editor tabs can be renamed. This can be particularly helpful if you have multiple tabs, with a different focus on each (i.e. different geographic region, different types of data, etc.). New Projection \uf0c1 A new map projection can be created using the file menu: File > New Projection .","title":"Maps, Views, Projections"},{"location":"cave/maps-views-projections/#maps-views-projections","text":"","title":"Maps, Views, Projections"},{"location":"cave/maps-views-projections/#default-map-scales","text":"The first toolbar menu item is a dropdown menu for different geographic areas and map projections. The default view is always CONUS , which is a North Polar Stereographic projection centered on the Continental United States. Default projections and areas available in the menu CONUS N. 
Hemisphere (North Polar Stereographic) Regional (for the selected localization site) WFO (for the selected localization site) World - Mercator World - CED World - Mollweide GOES East Full Disk (Geostationary) GOES West Full Disk (Geostationary) Regional Mercator projections for Africa Alaska Antarctica Arctic Australia,New Zealand Europe Hawaii Japan Pacific Ocean Puerto Rico South America WFO (Has a submenu which contains a map scale for every NWS localization site)","title":"Default Map Scales"},{"location":"cave/maps-views-projections/#new-map-editor-view","text":"","title":"New Map Editor / View"},{"location":"cave/maps-views-projections/#adding-a-new-map-editor","text":"This can be done in two ways: using the file menu and right clicking on the tab bar. Using the file menu, simply go to: File > New Map . This opens a new map editor tab with the default projection (CONUS Polar Stereographic). To use the tab bar, right-click on or next to any tab and select New Editor","title":"Adding a New Map Editor"},{"location":"cave/maps-views-projections/#renaming-map-editor","text":"Any of the map editor tabs can be renamed. This can be particularly helpful if you have multiple tabs, with a different focus on each (ie. different geographic reigon, different types of data, etc).","title":"Renaming Map Editor"},{"location":"cave/maps-views-projections/#new-projection","text":"A new map projection can be created using the file menu: File > New Projection .","title":"New Projection"},{"location":"cave/ncp-perspective/","text":"The National Centers Perspective (NCP) \uf0c1 The NCP toolbar includes two buttons to load Data and Bundles , respectively. The toolbar also include a Clear button, Zoom and Unzoom , and the NSHARP plugin. Loading Data \uf0c1 Click the \" +Data \" button. Select Category , Source , Group , and Attributes Double-click the product, or select \" Add \" and the data will load to CAVE with the default number of frames (Note: this makes time-matching more difficult. For time-matching multiple products, load as a Bundle .) Latest Available Data Time or Cycle Time is underneath the Attributes column at bottom-right. Create a Bundle \uf0c1 Open the Resource Manager by: Click the \" +Bundle \" button on the toolbar Press the Spacebar Press the \" W \" key Click File -> New -> Bundle . Timeline \uf0c1 A timeline is displayed for available data. Here, the user may choose the dominant resource, number of frames, time range, reference time, etc. for the products to be displayed. Clicking \" Load \" will keep open the Resource Manager while the selected data layers are loaded to the map. \u201d Load and Close \u201d will display data and close the Resource Manager. Save a Bundle \uf0c1 In AWIPS II CAVE, Bundles are organized within the Resource Manager GUI. Steps in the Bundle creation process are prompted with new GUI windows that are specific to the operation taking place, as you will see below. Select resources for a Bundle (as in previous steps). Click the \" Save Bundle \". Select or type-in your desired Group Name and Bundle name and click \" Save Bundle \". After saving a Bundle, its a good idea to confirm that it loads correctly. Select \" Bundle \" -> \u201c Load Bundle \u201d to find your newly created Bundle. The \" Edit Bundle \" button is available to make any changes while loading. Manage Bundles \uf0c1 The third tab in the Resource Manager, titled Manage Bundles can be used to do just that: modify, create, and delete existing Bundle Groups. 
At the top left, there are 3 options: Modify Bundle Group , Create Bundle Group , and Delete Bundle Group . The user can change the order of the Bundles within the Bundle Group by clicking the \" Move Up \" and \u201c Move Down \u201d buttons on the right. A user can add Bundles to an existing Bundle Group by clicking the \u201c Add Bundle \u201d button. A new GUI will pop up, allowing the user to select a Bundle that exists within a different Bundle Group or a current CAVE display. A Bundle may be renamed by clicking the \" Rename Bundle \" button. Similarly, a Bundle may be removed from a specific Bundle Group by clicking the \u201c Remove Bundle \u201d button. NOTE: any changes made here must be saved by clicking the \u201c Save Bundle Group \u201d button on the left-hand side. Deleting a Bundle Group is a fairly straightforward action. First, click the \" Delete Bundle Group \" option on the top-left, then select the Bundle Group and Name to be deleted. Edit Data Sources \uf0c1 Selecting a Resource to edit allows you to update the number of frames, frame span, range and timeline form. The plugin name and grid name ( GDFILE ) can also be edited. Edit Resource Attributes \uf0c1 Using gridded data, selecting an Attribute to edit allows you to change the GEMPAK syntax used to define the resource. Add a New Grid \uf0c1 Click the \" Bundle \" button and then open the \u201c Manage Data \u201d tab. Select the category (we will use GRID ). Select a model to copy as a template. In this example we select the base \" NAM-12km \" model. Click the \" Copy \" button underneath the GRID column. You can edit the new resource under \" Edit Resource Type \". Choose a name for the new resource (e.g. WRF ) In \" Edit Resource Parameters \", change the \u201c GDFILE= \u201d definition to match the name of the new model in the database (In this case we change GDFILE=NAM to GDFILE=WRF) . Click \" Create \" at the bottom of the window to finish. The new Resource now displays with a ( U ) next to the name, signifying a user-created item. In Attribute Groups , you can add attributes to a resource by clicking \" Edit \". Select the desired Attribute Set and click \" Add -> \" to add it to the right column (You can hold the Ctrl key and select multiple Attributes.) Click \" Save \" and then \u201c Ok \u201d. In the \" Create Bundle \" tab, click \u201c New \u201d to see the new Resource. Multi-Pane Display \uf0c1 The NCP includes a configurable multi-pane display. As seen in the figure below, selecting the \"Multi-Pane\" check box extends the GUI window and displays additional options. Selecting the \" Multi-Pane Display \" checkbox enables the multi-pane builder. This new feature allows you to customize the number of panes you would like to display in AWIPS II CAVE. The \"Select Pane\" portion of the GUI allows you to load different products into each pane, which includes importing previously created bundles. Here are a few quick steps to create a Multi-pane display in AWIPS II: Click the Multi-Pane checkbox in the Resource Manager Select the number of Rows and Columns you would like your data display to contain Select the precise pane in which you would like a specific product (i.e.
Row 1, Column1) Choose a product through the Add button (See Data Selection above) Select a different cell in your multi-paned display in which you would like to display a product the user will need to load a separate product from the Resource Manager for each pane in the select pane layout Repeat step #4 Repeat the previous steps, until all of your panes have products queued up inside. Click \" Load \" and your multi-paned display will appear Load Multiple Bundles \uf0c1 The Load Bundle tab in the Resource Manager can be used to load Bundles previously created by the user: The user should select name of the Group in which the desired Bundle is housed. After doing so, a list of available Bundles will appear in the centrally located \"Bundles\" pane. Selecting a Bundle will populate the pane on the right, which displays the contents of each Bundle, and also provides information on its Localization settings. Clicking \"Load\" or \u201cLoad and Close\u201d at the bottom of this window will load the saved Bundle. Before doing so, you can adjust things like Frames, dominant resources, time range, etc. in the \u201c Select Timeline \u201d section at the bottom of the window. Multiple Bundles can be selected and loaded all at once by simply hold the Ctl key and multi-selecting Bundles from the central pane, and then clicking either of the Load buttons. If multiple Bundles are loaded at once, they will each be displayed in different tabs in the CAVE interface. The order/arrangement of the Bundles will be mimicked in the order of the tabs when displayed in CAVE. Finally, the user may also edit an Bundle in this tab, simply by clicking the \" Edit \" button, and making desired changes in the GUI that pops up.","title":"The National Centers Perspective (NCP)"},{"location":"cave/ncp-perspective/#the-national-centers-perspective-ncp","text":"The NCP toolbar includes two buttons to load Data and Bundles , respectively. The toolbar also include a Clear button, Zoom and Unzoom , and the NSHARP plugin.","title":"The National Centers Perspective (NCP)"},{"location":"cave/ncp-perspective/#loading-data","text":"Click the \" +Data \" button. Select Category , Source , Group , and Attributes Double-click the product, or select \" Add \" and the data will load to CAVE with the default number of frames (Note: this makes time-matching more difficult. For time-matching multiple products, load as a Bundle .) Latest Available Data Time or Cycle Time is underneath the Attributes column at bottom-right.","title":"Loading Data"},{"location":"cave/ncp-perspective/#create-a-bundle","text":"Open the Resource Manager by: Click the \" +Bundle \" button on the toolbar Press the Spacebar Press the \" W \" key Click File -> New -> Bundle .","title":"Create a Bundle"},{"location":"cave/ncp-perspective/#timeline","text":"A timeline is displayed for available data. Here, the user may choose the dominant resource, number of frames, time range, reference time, etc. for the products to be displayed. Clicking \" Load \" will keep open the Resource Manager while the selected data layers are loaded to the map. \u201d Load and Close \u201d will display data and close the Resource Manager.","title":"Timeline"},{"location":"cave/ncp-perspective/#save-a-bundle","text":"In AWIPS II CAVE, Bundles are organized within the Resource Manager GUI. Steps in the Bundle creation process are prompted with new GUI windows that are specific to the operation taking place, as you will see below. Select resources for a Bundle (as in previous steps). 
Click the \" Save Bundle \". Select or type-in your desired Group Name and Bundle name and click \" Save Bundle \". After saving a Bundle, its a good idea to confirm that it loads correctly. Select \" Bundle \" -> \u201c Load Bundle \u201d to find your newly created Bundle. The \" Edit Bundle \" button is available to make any changes while loading.","title":"Save a Bundle"},{"location":"cave/ncp-perspective/#manage-bundles","text":"The third tab in the Resource Manager, titled Manage Bundles can be used to do just that: modify, create, and delete existing Bundle Groups. At the top left, there are 3 options: Modify Bundle Group , Create Bundle Group , and Delete Bundle Group . The user can change the order of the Bundles within the Bundle Group, by clicking the \" Move Up \" and \u201c Move Down \u201d buttons on the right. A user can add Bundles to an existing Bundle Group by clicking the \u201c Add Bundle \u201d button. A new Gui will pop up, allowing the user to select a Bundle that exists within a different Bundle Group or a current CAVE display. A Bundle may be renamed by clicking the \" Rename Bundle \" button. Similarly, an Bundle may be removed from a specific Bundle Group by clicking the \u201c Remove Bundle \u201d button. NOTE: any changes made here must be saved by clicking the \u201c Save Bundle Group \u201d button on the left-hand side. Deleting an Bundle Group is a fairly straightforward action. First, click the \" Delete Bundle Group \" option on the top-left, then select the Bundle Group Group and Name to be deleted.","title":"Manage Bundles"},{"location":"cave/ncp-perspective/#edit-data-sources","text":"Selection a Resource to edit allows you to update the number of frames, frame span, range and timeline form. The plugin name and grid name ( GDFILE ) can also be edited.","title":"Edit Data Sources"},{"location":"cave/ncp-perspective/#edit-resource-attributes","text":"Using gridded data, selecting an Attribute to edit allows you to change the GEMPAK syntax used to define the resource.","title":"Edit Resource Attributes"},{"location":"cave/ncp-perspective/#add-a-new-grid","text":"Click the \" Bundle \" button and then open the \u201c Manage Data \u201d tab. Select the category (we will use GRID ). Select a model to copy as a template. In this example we select the base \" NAM-12km \" model. Click the \" Copy \" button underneath the GRID column. You can edit the new resource under \" Edit Resource Type \". Choose a name for the new resource (e.g. WRF ) In \" Edit Resource Parameters \", change the \u201c GDFILE= \u201d definition to match the name of the new model in the database (In this case we change GDFILE=NAM to GDFILE=WRF) . Click \" Create \" at the bottom of the window to finish. The new Resource now displays with a ( U ) next to the name, signifying a user-created item. In Attribute Groups , you can add attributes to a resource by clicking \" Edit \". Select the desired Attribute Set and click \" Add -> \" to add it to the right column (You can hold the Ctrl key and select multiple Attributes.) Click \" Save \" and then \u201c Ok \u201d. In the \" Create Bundle \" tab, click \u201c New \u201d to see the new Resource.","title":"Add a New Grid"},{"location":"cave/ncp-perspective/#multi-pane-display","text":"The NCP includes a configurable multi-pane display. As seen in the figure below, selecting the \"Multi-Pane\" check box extends the GUI window and displays additional options. Selecting the \" Multi-Pane Display \" checkbox enables the multi-pane builder. 
This new feature allows you to customize the number of panes you would like to display in AWIPS II CAVE. The \"Select Pane\" portion of the GUI allows you to load different products into each pane, which includes importing previously created bundles. Here are a few quick steps to creating a Multi-pane display in AWIPS II: Click the Multi-Pane checkbox in the Resource Manager Select the number of Rows and Columns you would like your data display to contain Select the precise pane in which you would like a specific product (i.e. Row 1, Column1) Choose a product through the Add button (See Data Selection above) Select a different cell in your multi-paned display in which you would like to display a product the user will need to load a separate product from the Resource Manager for each pane in the select pane layout Repeat step #4 Repeat the previous steps, until all of your panes have products queued up inside. Click \" Load \" and your multi-paned display will appear","title":"Multi-Pane Display"},{"location":"cave/ncp-perspective/#load-multiple-bundles","text":"The Load Bundle tab in the Resource Manager can be used to load Bundles previously created by the user: The user should select name of the Group in which the desired Bundle is housed. After doing so, a list of available Bundles will appear in the centrally located \"Bundles\" pane. Selecting a Bundle will populate the pane on the right, which displays the contents of each Bundle, and also provides information on its Localization settings. Clicking \"Load\" or \u201cLoad and Close\u201d at the bottom of this window will load the saved Bundle. Before doing so, you can adjust things like Frames, dominant resources, time range, etc. in the \u201c Select Timeline \u201d section at the bottom of the window. Multiple Bundles can be selected and loaded all at once by simply hold the Ctl key and multi-selecting Bundles from the central pane, and then clicking either of the Load buttons. If multiple Bundles are loaded at once, they will each be displayed in different tabs in the CAVE interface. The order/arrangement of the Bundles will be mimicked in the order of the tabs when displayed in CAVE. Finally, the user may also edit an Bundle in this tab, simply by clicking the \" Edit \" button, and making desired changes in the GUI that pops up.","title":"Load Multiple Bundles"},{"location":"cave/nsharp/","text":"NSHARP \uf0c1 NSHARP, which stands for the N ational Center S ounding and H odograph A nalysis and R esearch P rogram, is an AWIPS plugin originally based on NAWIPS NSHAREP, SPCs BigSHARP sounding display tool, and the Python package SHARpy . NSHARP is available a number of ways in CAVE: From the D2D toolbar select the NSHARP icon: From the Upper Air menu select NSHARP Soundings From the Upper Air menu select a station from the RAOB menus From the Upper Air menu select NUCAPS Soundings From the Models or Tools menu select Volume Browser Make sure Sounding is selected from the menu at the top Select a source that has data (signified by a green box to the right) Select Soundings from the Fields menu Select any point from the Planes menu and an option will load in the table To create a new point go to select Tools > Points and use the right-click-hold menu to create a new point anywhere on the map Use the Load button to load data and open the NSharp display NSHARP Configurations \uf0c1 NSHARP has four configurations for use in different operational settings: SPC Wide - more insets and graphs at the expense of timeline/station inventory. 
D2D Skewt Standard - default for WFOs, larger SkewT with inventory, no Wind/Height, temperature advection, insets, or graphs. D2D Lite - Skew-T, table, and inventory only. OPC - Ocean Prediction Center display. To change the NSHARP configuration: Open the NSHARP(D2D) controls tab by clicking on the NSHARP toolbar ( ) icon again Click the Configure button Click Display Pane Configuration (third from the bottom) Use the dropdown to choose a configuration, apply, save, close If you would like to interactively explore the different graphical areas in NSHARP on the Web , see the NSHARP Interactive Overview . Skew-T Display \uf0c1 The Skew-T display renders a vertical profile of temperature, dew point, and wind for RAOBs and model point soundings using a Skew-T Log-P diagram. The box in the upper-left of the main display is linked to the cursor readout when over the SkewT chart. It reports the temperature, dewpoint, wind direction and speed, pressure, height AGL, and relative humidity of the trace. Skew-T is the default upper air chart in AWIPS, and can be changed to turbulence display ( T ) or an icing display ( I ). These options are available as buttons at the bottom of the NSHARP(D2D) controls tab (mentioned in NSHARP Configurations ). Use the AWIPS-2 NSHARP Interactive Overview page for more information about the Skew-T display. Windspeed vs Height and Inferred Temperature Advection \uf0c1 The windspeed vs height and inferred temperature advection with height plot is situated next to the SkewT to show the values at the same heights. Inferred temperature advection is from the thermal wind. Use the AWIPS-2 NSHARP Interactive Overview page for more information. Hodograph Display \uf0c1 This panel contains the hodograph display from the sounding data. The rings in the hodograph represent the wind speed in 20 knot increments. The hodograph trace uses different colors to highlight wind observations in 3 km height increments. This display also contains information such as the mean wind, Bunkers Left/Right Moving storm motion, upshear and downshear Corfidi vectors, and a user-defined motion. Use the AWIPS NSHARP Interactive Overview page for more information about the hodograph display. Insets \uf0c1 In the SPC Wide Screen Configuration there are four small insets beneath the hodograph containing storm-relative windspeed versus height, a Storm Slinky, Theta-E vs Pressure, Possible Watch Type, Theta-E vs Height, and storm-relative wind vectors. There are buttons in the NSHARP(D2D) control button tab that toggle the six possible contents in the four boxes. There are also two buttons PvIn and NxIn in the control tab that can be used to cycle through the previous and next inset. Use the AWIPS NSHARP Interactive Overview page for more information. Table Output Displays \uf0c1 The Table Output Displays contain five different pages of parameters ranging from parcel instability to storm relative shear to severe hazards potential. There are two buttons PtDt and NxDt in the controls tab that can be used to cycle through the previous and next tables. Use the AWIPS NSHARP Interactive Overview page for more information on the tables and a list/definition of the parameters available.
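The same RAOB profiles that NSHARP draws can also be requested programmatically through the python-awips Data Access Framework. The sketch below is a hedged example only: the EDEX hostname is a placeholder, and the station number and parameter names are assumptions taken from typical upper-air examples, so they may differ on your server.

```python
# A minimal sketch (not the NSHARP code path): request an upper-air sounding
# with python-awips. Hostname, station ID, and parameter names are assumptions.
from awips.dataaccess import DataAccessLayer

DataAccessLayer.changeEDEXHost("edex.example.edu")   # placeholder EDEX server

request = DataAccessLayer.newDataRequest()
request.setDatatype("bufrua")                        # decoded RAOB soundings
request.setLocationNames("72558")                    # WMO station number (example)
request.setParameters("prMan", "htMan", "tpMan", "tdMan", "wdMan", "wsMan")

times = DataAccessLayer.getAvailableTimes(request)
response = DataAccessLayer.getGeometryData(request, times[-1:])  # latest sounding only

for ob in response:
    # Print pressure and temperature at each mandatory level of the sounding.
    print(ob.getDataTime(), ob.getNumber("prMan"), ob.getNumber("tpMan"))
```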
Graphs/Statistics \uf0c1 In the SPC Wide Screen Configuration there are two graphs boxes under the insets, and they can display information on Enhanced Bulk Shear, Significant Tornado Parameter, Significant Hail Parameter (SHIP), Winter Weather, Fire Weather, Hail model (not implemented), and the Sounding Analog Retrieval System (SARS). There are buttons in the NSHARP(D2D) control button tab that toggle the six possible contents in the two boxes. Use the AWIPS NSHARP Interactive Overview page for more information. Sounding Inventory \uf0c1 This section controls the inventory of the soundings that have been loaded for potential display in NSHARP. The different colors of the text represent variously that a sounding/station is being displayed, available for display, or not available for display. Use the AWIPS NSHARP Interactive Overview page for more information on how to use the sounding inventory and time line.","title":"NSHARP"},{"location":"cave/nsharp/#nsharp","text":"NSHARP, which stands for the N ational Center S ounding and H odograph A nalysis and R esearch P rogram, is an AWIPS plugin originally based on NAWIPS NSHAREP, SPCs BigSHARP sounding display tool, and the Python package SHARpy . NSHARP is available a number of ways in CAVE: From the D2D toolbar select the NSHARP icon: From the Upper Air menu select NSHARP Soundings From the Upper Air menu select a station from the RAOB menus From the Upper Air menu select NUCAPS Soundings From the Models or Tools menu select Volume Browser Make sure Sounding is selected from the menu at the top Select a source that has data (signified by a green box to the right) Select Soundings from the Fields menu Select any point from the Planes menu and an option will load in the table To create a new point go to select Tools > Points and use the right-click-hold menu to create a new point anywhere on the map Use the Load button to load data and open the NSharp display","title":"NSHARP"},{"location":"cave/nsharp/#nsharp-configurations","text":"NSHARP has four configurations for use in different operational settings: SPC Wide - more insets and graphs at the expense of timeline/station inventory. D2D Skewt Standard - default for WFOs, larger SkewT with inventory, no Wind/Height, temperature advection, insets, or graphs. D2D Lite - Skew-T, table, and inventory only. OPC - Ocean Prediction Center display. To change the NSHARP confiuguration: Open the NSHARP(D2D) controls tab by clicking on the Nsharp toolbar ( ) icon again Click the Configure button Click Display Pane Configuration (third from the bottom) Use the dropdown to choose a configuration, apply, save, close If you would like to interactively explore the different graphical areas in NSHARP on the Web , see the NSHARP Interactive Overview .","title":"NSHARP Configurations"},{"location":"cave/nsharp/#skew-t-display","text":"The Skew-T display renders a vertical profile of temperature, dew point, and wind for RAOBs and model point soundings using a Skew-T Log-P diagram. The box in the upper-left of the main display is linked to the cursor readout when over the SkewT chart. It reports the temperature, dewpoint, wind direction and speed, pressure, height AGL, and relative humidity of the trace. Skew-T is the default upper air chart in AWIPS, and can be changed to turbulence display ( T ) or an icing display ( I ). These options are available as buttons at the bottom of the NSHARP(D2D) controls tab (mentioned in NSHARP Configurations ). 
Use the AWIPS-2 NSHARP Interactive Overview page for more information about the Skew-T display.","title":"Skew-T Display"},{"location":"cave/nsharp/#windspeed-vs-height-and-inferred-temperature-advection","text":"The windspeed vs height and inferred temperature advection with height plot is situated next to the SkewT to show the values at the same heights. Inferred temperature advection is from the thermal wind. Use the AWIPS-2 NSHARP Interactive Overview page for more information.","title":"Windspeed vs Height and Inferred Temperature Advection"},{"location":"cave/nsharp/#hodograph-display","text":"This panel contains the hodograph display from the sounding data. The rings in the hodograph represent the wind speed in 20 knot increments. The hodograph trace uses different colors to highlight wind observations in 3 km height increments. This display also contains information such as the mean wind, Bunkers Left/Right Moving storm motion, upshear and downshear Corfidi vectors, and a user-defined motion. Use the AWIPS NSHARP Interactive Overview page for more information about the hodograph display.","title":"Hodograph Display"},{"location":"cave/nsharp/#insets","text":"In the SPC Wide Screen Configuration there are four small insets beneath the hodograph containing storm-relative windspeed versus height, a Storm Slinky, Theta-E vs Pressure, Possible Watch Type, Thea-E vs Height, and storm-relative wind vectors. There are buttons in the NSHARP(D2D) control button tab that toggle the six possible contents in the four boxes. There are also two buttons PvIn and NxIn in the control tab that can be used to cycle through the previous and next inset. Use the AWIPS NSHARP Interactive Overview page for more information.","title":"Insets"},{"location":"cave/nsharp/#table-output-displays","text":"The Table Output Displays contains five different pages of parameters ranging from parcel instability to storm relative shear to severe hazards potential. There are two buttons PtDt and NxDt in the controls tab that can be used to cycle through the previous and next tables. Use the AWIPS NSHARP Interactive Overview page for more information on the tables and a list/definition of the parameters available.","title":"Table Output Displays"},{"location":"cave/nsharp/#graphsstatistics","text":"In the SPC Wide Screen Configuration there are two graphs boxes under the insets, and they can display information on Enhanced Bulk Shear, Significant Tornado Parameter, Significant Hail Parameter (SHIP), Winter Weather, Fire Weather, Hail model (not implemented), and the Sounding Analog Retrieval System (SARS). There are buttons in the NSHARP(D2D) control button tab that toggle the six possible contents in the two boxes. Use the AWIPS NSHARP Interactive Overview page for more information.","title":"Graphs/Statistics"},{"location":"cave/nsharp/#sounding-inventory","text":"This section controls the inventory of the soundings that have been loaded for potential display in NSHARP. The different colors of the text represent variously that a sounding/station is being displayed, available for display, or not available for display. Use the AWIPS NSHARP Interactive Overview page for more information on how to use the sounding inventory and time line.","title":"Sounding Inventory"},{"location":"cave/pgen/","text":"Product Generator (PGEN) \uf0c1 The National Centers use PGEN to draw annotations and generate all their products, and it is included in D-2D to support Center Weather Service Units (CWSUs) making AWC-style SIGMETs. 
While this is not intended to be used for other purposes, there are a number of unique drawing and annotation tools that can be used to make images using the CAVE->export->Image once a display has been created.","title":"Product Generator (PGEN)"},{"location":"cave/pgen/#product-generator-pgen","text":"The National Centers use PGEN to draw annotations and generate all their products, and it is included in D-2D to support Center Weather Service Units (CWSUs) making AWC-style SIGMETs. While this is not intended to be used for other purposes, there are a number of unique drawing and annotation tools that can be used to make images using the CAVE->export->Image once a display has been created.","title":"Product Generator (PGEN)"},{"location":"cave/unused-components/","text":"An overview of components that are used operationally by the NWS but are made inactive in the Unidata release. Some components are impractical for non-operational use, and some are unavailable for distribution outside of the NWS. Data Delivery \uf0c1 The \"Data Delivery\" option opens the Data Delivery application. Data Delivery is a permission-based application, meaning that the System Manager or User Administrator controls the user's access to the Data Delivery functionalities. If granted permission to access this application, Data Delivery allows a user to subscribe to a data source or create an ad hoc request and have the data delivered in near real time. Whether delivered by subscription or in response to an ad hoc request, the data can be tailored to a user's specific temporal, geographic, and parameter needs. For a detailed description of the Data Delivery application, refer to Section 16.1. Collaboration \uf0c1 The \"Collaboration\" option offers two main functions: chatting and sharing displays. Chat allows users to send and receive instant messages or chat with fellow forecasters and offices in a chat room. Sharing displays adds to the chat room capabilities and allows the room's creator to show a CAVE map display to other participants in the room. For a detailed description of \"Collaboration\" and information on how to create a chat session and share displays, refer to Section 16.2. Archive Case Creation \uf0c1 The \"Archive Case Creation\" option is a component of the AWIPS-2 Archiver application. The archiver application is a permission-based functionality. It allows a user to extract stored weather event data and copy it into a user-defined directory to be archived (e.g., burned to a DVD). The archived data can later be played back for simulation of weather events using the WES-2 Bridge. For a detailed description of the AWIPS-2 Archiver application and the \"Archive Case Creation\" component, refer to Section 16.3. Archive Retention: The \"Archive Retention\" option is a component of the AWIPS-2 Archiver application. The archiver retention functionality and its purge component, which runs on EDEX, are permission-based functionalities. Access to the \"Archive Retention\" option is limited to User Administrators and users identified as a database/purge focal point. More information on these AWIPS-2 Archiver application functionalities are provided in the System Manager's Manual. AWIPS User Administration \uf0c1 Some of the functionalities of certain CAVE applications (currently, Data Delivery and Localization) are reserved for designated users. User Administrators choose the \"AWIPS User Administration\" option to access the screens they use to set permissions and roles for the reserved functions. 
Access to the \"AWIPS User Administration\" option is limited to User Administrators. Other users who select this option will be denied access and receive the Alert Message shown in Exhibit 2.2.6.1-5. More information on AWIPS User Administration is provided in the System Manager's Manual. LDAD (Local Data Acquisition and Dissemination) \uf0c1 The LDAD system provides the means to acquire local data sets, perform quality control on the incoming data, and disseminate weather data to the external user community. It contains a number of components that reside on the internal AWIPS network and on the external LDAD component (on the LDAD server cluster). The internal and external components at WFOs are separated by a security firewall. The basic LDAD concept simplifies this process for both the data providers and for the support team. LDAD uses a simple data format, ASCII Comma Separated Values Text (CSVText), which separates each data field by a comma. A set of metadata files, created and maintained by the data provider or in conjunction with site personnel, will be used by the acquisition decoder. This facilitates data processing in hydrometeorological units instead of sensor units and removes the need for conversion routines. The simplicity of the CSVText format increases the likelihood that the data provider will use this standardized format. All LDAD acquisition data are categorized and stored into the following four classes: * Mesonet for surface weather observations * Hydro for rain and stream observations * Manual for manual observations such as cooperative observers * Upper air for multilevel observations such as profilers. The LDAD functionality supports the acquisition of the Integrated Flood Observing and Warning System (IFLOWS); ALERT; Mesonet; Profiler; RRS/Upper Air; Gauges (LARC, Handar, Campbell, Sutron); COOP (Co-operative Observations); and other data transported via LDM, Rsync, or other TCP/IP Protocols. The Data Acquisition function is achieved when data is transmitted to the internal (trusted) AWIPS servers. The data is transmitted to and from the LDAD Cluster via TCP/IP protocols or RS-232 communications. The Data Dissemination function is achieved when data is transmitted to the LDAD Cluster from the internal AWIPS system and is then distributed to External Users. The data can be acquired, stored, and displayed once fully configured. The LDAD System consists of two LDAD servers (LS2/3), a LAN switch (SMC 8024), a Terminal Server (Cyclades ACS32), Modems (MultiTech MT5600BR), and a LAN DMZ (HP ProCurve 2824). The DMZ consists of two SSG 320M Firewalls, a Netgear 16 switch, and two Netgear 5 port hubs. The LDAD baseline processes run on the LDAD Cluster (DMZ) and the AWIPS PX Cluster (Internal). Other local applications may run on other internal clusters, such as DX cluster in the case of the LDAD Dissemination Server. Data is transmitted through the DMZ either to the Trusted (internal) AWIPS system or to the Untrusted (External) Users/Systems.","title":"Unused components"},{"location":"cave/unused-components/#data-delivery","text":"The \"Data Delivery\" option opens the Data Delivery application. Data Delivery is a permission-based application, meaning that the System Manager or User Administrator controls the user's access to the Data Delivery functionalities. If granted permission to access this application, Data Delivery allows a user to subscribe to a data source or create an ad hoc request and have the data delivered in near real time. 
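Because CSVText is plain comma-separated ASCII, a local acquisition decoder can be prototyped with the standard Python csv module. The record layout below is invented for illustration and is not an official LDAD schema; real field definitions come from the site's metadata files.

```python
import csv
import io

# Hypothetical CSVText mesonet records: station, time, temp (F), dewpoint (F), wind (kt).
# The field layout is an illustration only; real layouts come from the site metadata files.
sample = "KLNK,2024-05-01 18:55,71.2,55.4,12\nKOMA,2024-05-01 18:55,73.0,57.1,9\n"

FIELDS = ["station", "obs_time", "temp_f", "dewpoint_f", "wind_kt"]

for row in csv.reader(io.StringIO(sample)):
    record = dict(zip(FIELDS, row))
    # Convert the string fields into numeric hydrometeorological units before storage.
    record["temp_f"] = float(record["temp_f"])
    record["dewpoint_f"] = float(record["dewpoint_f"])
    record["wind_kt"] = int(record["wind_kt"])
    print(record)
```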
Whether delivered by subscription or in response to an ad hoc request, the data can be tailored to a user's specific temporal, geographic, and parameter needs. For a detailed description of the Data Delivery application, refer to Section 16.1.","title":"Data Delivery"},{"location":"cave/unused-components/#collaboration","text":"The \"Collaboration\" option offers two main functions: chatting and sharing displays. Chat allows users to send and receive instant messages or chat with fellow forecasters and offices in a chat room. Sharing displays adds to the chat room capabilities and allows the room's creator to show a CAVE map display to other participants in the room. For a detailed description of \"Collaboration\" and information on how to create a chat session and share displays, refer to Section 16.2.","title":"Collaboration"},{"location":"cave/unused-components/#archive-case-creation","text":"The \"Archive Case Creation\" option is a component of the AWIPS-2 Archiver application. The archiver application is a permission-based functionality. It allows a user to extract stored weather event data and copy it into a user-defined directory to be archived (e.g., burned to a DVD). The archived data can later be played back for simulation of weather events using the WES-2 Bridge. For a detailed description of the AWIPS-2 Archiver application and the \"Archive Case Creation\" component, refer to Section 16.3. Archive Retention: The \"Archive Retention\" option is a component of the AWIPS-2 Archiver application. The archiver retention functionality and its purge component, which runs on EDEX, are permission-based functionalities. Access to the \"Archive Retention\" option is limited to User Administrators and users identified as a database/purge focal point. More information on these AWIPS-2 Archiver application functionalities are provided in the System Manager's Manual.","title":"Archive Case Creation"},{"location":"cave/unused-components/#awips-user-administration","text":"Some of the functionalities of certain CAVE applications (currently, Data Delivery and Localization) are reserved for designated users. User Administrators choose the \"AWIPS User Administration\" option to access the screens they use to set permissions and roles for the reserved functions. Access to the \"AWIPS User Administration\" option is limited to User Administrators. Other users who select this option will be denied access and receive the Alert Message shown in Exhibit 2.2.6.1-5. More information on AWIPS User Administration is provided in the System Manager's Manual.","title":"AWIPS User Administration"},{"location":"cave/unused-components/#ldad-local-data-acquisition-and-dissemination","text":"The LDAD system provides the means to acquire local data sets, perform quality control on the incoming data, and disseminate weather data to the external user community. It contains a number of components that reside on the internal AWIPS network and on the external LDAD component (on the LDAD server cluster). The internal and external components at WFOs are separated by a security firewall. The basic LDAD concept simplifies this process for both the data providers and for the support team. LDAD uses a simple data format, ASCII Comma Separated Values Text (CSVText), which separates each data field by a comma. A set of metadata files, created and maintained by the data provider or in conjunction with site personnel, will be used by the acquisition decoder. 
This facilitates data processing in hydrometeorological units instead of sensor units and removes the need for conversion routines. The simplicity of the CSVText format increases the likelihood that the data provider will use this standardized format. All LDAD acquisition data are categorized and stored into the following four classes: * Mesonet for surface weather observations * Hydro for rain and stream observations * Manual for manual observations such as cooperative observers * Upper air for multilevel observations such as profilers. The LDAD functionality supports the acquisition of the Integrated Flood Observing and Warning System (IFLOWS); ALERT; Mesonet; Profiler; RRS/Upper Air; Gauges (LARC, Handar, Campbell, Sutron); COOP (Co-operative Observations); and other data transported via LDM, Rsync, or other TCP/IP Protocols. The Data Acquisition function is achieved when data is transmitted to the internal (trusted) AWIPS servers. The data is transmitted to and from the LDAD Cluster via TCP/IP protocols or RS-232 communications. The Data Dissemination function is achieved when data is transmitted to the LDAD Cluster from the internal AWIPS system and is then distributed to External Users. The data can be acquired, stored, and displayed once fully configured. The LDAD System consists of two LDAD servers (LS2/3), a LAN switch (SMC 8024), a Terminal Server (Cyclades ACS32), Modems (MultiTech MT5600BR), and a LAN DMZ (HP ProCurve 2824). The DMZ consists of two SSG 320M Firewalls, a Netgear 16 switch, and two Netgear 5 port hubs. The LDAD baseline processes run on the LDAD Cluster (DMZ) and the AWIPS PX Cluster (Internal). Other local applications may run on other internal clusters, such as DX cluster in the case of the LDAD Dissemination Server. Data is transmitted through the DMZ either to the Trusted (internal) AWIPS system or to the Untrusted (External) Users/Systems.","title":"LDAD (Local Data Acquisition and Dissemination)"},{"location":"cave/warngen/","text":"WarnGen Walkthrough \uf0c1 WarnGen is an AWIPS graphics application for creating and issuing warnings as is done by National Weather Service offices. In the Unidata AWIPS release it is a non-operational forecasting tool, meaning it allows users to experiment and simulate with the drawing and text-generation tools, but prevents you from transmitting a generated warning upstream . 
In order to select a feature it must be within your CAVE localization coverage (load Maps > County Warning Areas to see coverages) Quick Steps - Using WarnGen in Unidata AWIPS CAVE \uf0c1 Load NEXRAD Display from the Radar menu Choose a CWA with active severe weather (PAH is used in the video below) Re-localize to this site in the CAVE > Preferences > Localization menu Exit out of CAVE and reload (you should notice the new CWA at the top of CAVE) Load radar data from the local radar menu kpah > Z + SRM8 Use the \"period\" key in the number pad to toggle between the 0.5 Reflectivity and SRM Click WarnGen toolbar button or load from Tools > WarnGen Drag the storm marker to the center of a storm feature Step through frames back and forth and adjust the marker to match the trajectory of the storm feature Click Track in the WarnGen GUI to update the polygon shape and trajectory From the WarnGen dialog select the type of warning to generate, time range, basis of the warning, and any threats (wind, hail, etc.) Click Create Text at the bottom of the WarnGen dialog to generate a text warning product in a new window Note: Since you are not \"issuing\" the warning, leave the top two rows blank (\"TTAAii\" and \"CCCC\") and click \"Enter\" and a separate text window should open Click Reset at the top of the WarnGen dialog to reset the storm marker at any time Select Line of Storms to enable a two-pointed vector which is to be positioned parallel to a storm line To add another vertex , middle button click along the polygon Video - Using WarnGen in AWIPS \uf0c1 The video below walks through creating a warning polygon and text in AWIPS. More detailed information can be found in the text below the video. Load NEXRAD level 3 display \uf0c1 Select the menu Radar > NEXRAD Display and note coverage areas of current severe weather. We choose a CWA ID that contains some active severe weather (PAH Paducah, Kentucky, in this example). Select SITE Localization \uf0c1 Open CAVE > Preferences > Localization , select the CWA site ID (PAH) for the coverage area you want to use, followed by Apply and Okay and restart CAVE. Once CAVE is restarted, you should notice the new CWA at the top of the CAVE window. Load single radar data from the local radars \uf0c1 Click on the local radar kpah > Z + SRM8 . Use the \"period\" key in the number pad to toggle between the 0.5 Reflectivity and SRM. Launch WarnGen \uf0c1 Select WarnGen from the D2D Toolbar or from the Tools > WarnGen menu. When started, the storm centroid marker appears and the WarnGen GUI will pop up as a separate window. Generate a Storm Motion Vector \uf0c1 Click and drag Drag Me to Storm to the feature you want to track (WarnGen uses a dot to track a single storm and a line to track a line of storms). Step back 3 to 4 frames. Drag the dot to the previous position of the feature you first marked to create the storm motion vector. Click the Track button in the WarnGen GUI to update the polygon based on the storm motion.
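The storm motion that WarnGen derives from the two marker positions is simply a displacement over the elapsed time between frames. The Python sketch below shows that arithmetic with a flat-earth approximation; the coordinates and frame interval are invented for illustration, and this is not WarnGen's internal code.

```python
import math

# Two marker positions (lat, lon in degrees) and the time between them, in minutes.
# Values are invented for illustration; this is not WarnGen's internal calculation.
lat1, lon1 = 37.00, -88.80   # older frame
lat2, lon2 = 37.08, -88.60   # current frame
minutes = 25.0

# Flat-earth approximation, adequate over the few tens of km a storm moves between frames.
km_per_deg_lat = 111.0
dy = (lat2 - lat1) * km_per_deg_lat
dx = (lon2 - lon1) * km_per_deg_lat * math.cos(math.radians((lat1 + lat2) / 2.0))

distance_km = math.hypot(dx, dy)
speed_kt = distance_km / (minutes / 60.0) / 1.852                     # km/h -> knots
direction_from = (math.degrees(math.atan2(dx, dy)) + 180.0) % 360.0   # meteorological "from"

print(f"storm motion: from {direction_from:.0f} deg at {speed_kt:.0f} kt")
```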
When reshaping your warning polygon in this manner, the philosophy is to include all areas that are at risk of experiencing severe weather covered by that warning type. Effective polygons account for uncertainty over time and typically widen downstream. Add and Remove Vertex Points \uf0c1 There will be some occasions where you will want to add vertices to your warning polygon. Most often, these situations will involve line warnings with bowing segments or single storm warnings where you want to account for storm motion uncertainty or multiple threat areas that may have differing storm motions. New vertices are added to the warning polygon two ways. Either by Right Mouse Button \"click and hold\" or a simple Middle Mouse Button click on the warning polygon line segment where you want to add the vertex. Vertex points are removed from the warning polygon using the same context relative menu. Instead of selecting a line segment, you select the vertex you wish to remove and then right mouse button click and hold and select remove vertex . Redrawing a Polygon \uf0c1 Click the Reset button to clear the current polygon and vector and reset the storm centroid marker. Generate a new storm motion by moving the storm markers and select the Track button in the WarnGen GUI to draw the new polygon. Text Window \uf0c1 Once you are satisfied with your polygon and have chosen your selections, click Create Text in the WarnGen GUI. Initially the AWIPS Header Block window appears. Leave the top two rows bank and click Enter for the text window to open. Using the customized settings in the WarnGen GUI, WarnGen translates the information into a text product that is displayed in a text window on the Text Display. The auto-generated text contains the storm speed and direction, the counties and cities affected by the warning/advisory, the valid times of the product, the warning/advisory body text (including any optional bullets selected in the GUI), and additional code to help our partners to efficiently process and disseminate the warning/advisory. The locked parts of the text are highlighted in blue and most of your text should not need to be edited if you configured your WarnGen window correctly. The Unidata AWIPS release is non-operational . You will be allowed to simulate the drawing and text-generation of warnings, but are prevented from transmitting a generated warning upstream Note: Edits made to product text in the editor window should be limited to items such as forecaster name/initials, call-to-action text, etc. If changes are warranted for items such as storm motion, warned counties, or Latitude/Longitude points, close the editor window and make changes using the D-2D and WarnGen graphical tools, then recreate the polygon and/or the text.","title":"WarnGen Walkthrough"},{"location":"cave/warngen/#warngen-walkthrough","text":"WarnGen is an AWIPS graphics application for creating and issuing warnings as is done by National Weather Service offices. In the Unidata AWIPS release it is a non-operational forecasting tool, meaning it allows users to experiment and simulate with the drawing and text-generation tools, but prevents you from transmitting a generated warning upstream . 
In order to select a feature it must be within your CAVE localization coverage (load Maps > County Warning Areas to see coverages)","title":"WarnGen Walkthrough"},{"location":"cave/warngen/#quick-steps-using-warngen-in-unidata-awips-cave","text":"Load NEXRAD Display from the Radar menu Choose a CWA with active severe weather (PAH is used in the video below) Re-localize to this site in the CAVE > Preferences > Localization menu Exit out of CAVE and reload (you should notice the new CWA at the top of CAVE) Load radar data from the local radar menu kpah > Z + SRM8 Use the \"period\" key in the number pad to toggle between the 0.5 Reflectivity and SRM Click WarnGen toolbar button or load from Tools > WarnGen Drag the storm marker to the center of a storm feature Step through frames back and forth and adjust the marker to match the trajectory of the storm feature Click Track in the Warngen GUI to update the polygon shape and trajectory From the WarnGen dialog select the type of warning to generate, time range, basis of the warning, and any threats (wind, hail, etc) Click Create Text at the bottom of the WarnGen dialog to generate a text warning product in a new window Note: Since you are not \"issuing\" the warning, leave the top to rows blank (\"TTAAii\" and \"CCCC\") and Click \"Enter\" and a separate text window should open Click Reset at the top of the WarnGen dialog to reset the storm marker at any time Select Line of Storms to enable a two-pointed vector which is to be positioned parallel to a storm line To add another vertex , middle button click along the polygon","title":"Quick Steps - Using WarnGen in Unidata AWIPS CAVE"},{"location":"cave/warngen/#video-using-warngen-in-awips","text":"The video below walks through creating a warning polygon and text in AWIPS. More detailed information can be found in the text below the video.","title":"Video - Using WarnGen in AWIPS"},{"location":"cave/warngen/#load-nexrad-level-3-display","text":"Select the menu Radar > NEXRAD Display and note coverage areas of current severe weather. We choose a CWA ID that contains some active severe weather (PAH Paducah, Kentucky, in this example).","title":"Load NEXRAD level 3 display"},{"location":"cave/warngen/#select-site-localization","text":"Open CAVE > Preferences > Localization , select the CWA site ID (PAH) for the coverage area you want to use, followed by Apply and Okay and restart CAVE. Once CAVE is restarted, you should notice the new CWA at the top of the CAVE window.","title":"Select SITE Localization"},{"location":"cave/warngen/#load-single-radar-data-from-the-local-radars","text":"Click on the local radar kpah > Z + SRM8 . Use the \"period\" key in the number pad to toggle between the 0.5 Reflectivity and SRM.","title":"Load single radar data from the local radars"},{"location":"cave/warngen/#launch-warngen","text":"Select WarnGen from the D2D Toolbar or from the Tools > WarnGen menu. When started, the storm centroid marker appears and the WarnGen GUI will pop up as a separate window.","title":"Launch WarnGen"},{"location":"cave/warngen/#generate-a-storm-motion-vector","text":"Click and drag Drag Me to Storm to the feature you want to track (WarnGen uses a dot to track a single storm and a line to track a line of storms). Step back 3 to 4 frames. Drag the dot to the previous position of the feature you first marked to create the storm motion vector. Click the Track button in the WarnGen GUI to update the polygon based off the storm motion. 
Review the product loop and make adjustments to ensure the vector is accurate. The initial polygon may have unhatched areas that will be removed from the warning due to crossing CWAs or not meeting area thresholds in the county for inclusion. The Warned/Hatched Area button allows you to preview the polygon shape that will be issued, so you can make further edits.","title":"Generate a Storm Motion Vector"},{"location":"cave/warngen/#moving-vertex-points","text":"Vertices can be moved by clicking and dragging with the mouse. The warning polygon, including stippling, will update automatically. When reshaping your warning polygon in this manner, the philosophy is to include all areas that are at risk of experiencing severe weather covered by that warning type. Effective polygons account for uncertainty over time and typically widen downstream.","title":"Moving Vertex Points"},{"location":"cave/warngen/#add-and-remove-vertex-points","text":"There will be some occasions where you will want to add vertices to your warning polygon. Most often, these situations will involve line warnings with bowing segments or single storm warnings where you want to account for storm motion uncertainty or multiple threat areas that may have differing storm motions. New vertices are added to the warning polygon two ways. Either by Right Mouse Button \"click and hold\" or a simple Middle Mouse Button click on the warning polygon line segment where you want to add the vertex. Vertex points are removed from the warning polygon using the same context relative menu. Instead of selecting a line segment, you select the vertex you wish to remove and then right mouse button click and hold and select remove vertex .","title":"Add and Remove Vertex Points"},{"location":"cave/warngen/#redrawing-a-polygon","text":"Click the Reset button to clear the current polygon and vector and reset the storm centroid marker. Generate a new storm motion by moving the storm markers and select the Track button in the WarnGen GUI to draw the new polygon.","title":"Redrawing a Polygon"},{"location":"cave/warngen/#text-window","text":"Once you are satisfied with your polygon and have chosen your selections, click Create Text in the WarnGen GUI. Initially the AWIPS Header Block window appears. Leave the top two rows bank and click Enter for the text window to open. Using the customized settings in the WarnGen GUI, WarnGen translates the information into a text product that is displayed in a text window on the Text Display. The auto-generated text contains the storm speed and direction, the counties and cities affected by the warning/advisory, the valid times of the product, the warning/advisory body text (including any optional bullets selected in the GUI), and additional code to help our partners to efficiently process and disseminate the warning/advisory. The locked parts of the text are highlighted in blue and most of your text should not need to be edited if you configured your WarnGen window correctly. The Unidata AWIPS release is non-operational . You will be allowed to simulate the drawing and text-generation of warnings, but are prevented from transmitting a generated warning upstream Note: Edits made to product text in the editor window should be limited to items such as forecaster name/initials, call-to-action text, etc. 
If changes are warranted for items such as storm motion, warned counties, or Latitude/Longitude points, close the editor window and make changes using the D-2D and WarnGen graphical tools, then recreate the polygon and/or the text.","title":"Text Window"},{"location":"dev/awips-development-environment/","text":"AWIPS Development Environment (ADE) \uf0c1 Detailed instructions on how to download the latest source code and run CAVE from Eclipse. It is important to keep in mind these instructions are intended for a system that is specifically used for developing AWIPS. It should not be used in conjunction with installed production versions of AWIPS. The following yum commands listed in these instructions may need to be run as the root user, but the rest of the commands should be run as the local user. 1. Remove AWIPS Instances \uf0c1 First, make sure to remove any instances of AWIPS that are already installed, this can potentially cause problems when setting up the development environment. Below is an example that had CAVE installed. Uninstall with yum: yum clean all yum groupremove awips2-cave Check to make sure all rpms have been removed: rpm -qa | grep awips2 Remove the awips2 directory: rm -rf /awips2 2. Set Up AWIPS Repo \uf0c1 Create a repo file named /etc/yum.repos.d/awips2.repo , and set the contents to the following: sudo vi /etc/yum.repos.d/awips2.repo [awips2repo] name=AWIPS II Repository baseurl=https://downloads.unidata.ucar.edu/awips2/current/linux/rpms/ el7-dev/ enabled=1 protect=0 gpgcheck=0 proxy=_none_ This file may already exist if AWIPS had been previously installed on the machine, so make sure to edit the baseurl. 3. Install the ADE \uf0c1 Install the AWIPS Development Environment (ADE) using yum. This will install Eclipse (4.6.1), Java (1.8), Ant (1.9.6), Python 2.7 and its modules (Numpy, Matplotlib, Shapely, Jep, and others). yum clean all yum groupinstall awips2-ade Check the libGLU package is installed by running rpm -qa | grep mesa-libGLU . If nothing is returned, install the package via: yum install mesa-libGLU . 4. Download the Source Code \uf0c1 If it's not already installed, install git: yum install git Next clone all of the required repositories for AWIPS: git clone https://github.com/Unidata/awips2.git git clone https://github.com/Unidata/awips2-cimss.git git clone https://github.com/Unidata/awips2-core.git git clone https://github.com/Unidata/awips2-core-foss.git git clone https://github.com/Unidata/awips2-drawing.git git clone https://github.com/Unidata/awips2-foss.git git clone https://github.com/Unidata/awips2-goesr.git git clone https://github.com/Unidata/awips2-gsd.git git clone https://github.com/Unidata/awips2-ncep.git git clone https://github.com/Unidata/awips2-nws.git Make sure to run git checkout in each repo if you'd wish to develop from a branch different from the default. It's best to do this before importing the repos into eclipse. 5. Configure Eclipse \uf0c1 Open eclipse by running: /awips2/eclipse/eclipse It is fine to choose the default workspace upon starting up. Set Preferences \uf0c1 Verify or make the following changes to set up eclipse for AWIPS development: Window > Preferences > Java > Installed JREs Set to /awips2/java Window > Preferences > PyDev > Interpreters > Python Interpreter Set to /awips2/python/bin/python Note: Add all paths to the SYSTEM pythonpath if prompted There might be some unresolved errors. These should be made to warnings instead. 
Window > Preferences > Java > Compiler > Building > Build path Problems > Circular Dependencies > Change to Warning Window > Preferences > Plug-in Development > API Baselines > Missing API Baseline > Change to Warning Turn off automatic building (you will turn this back on after importing the repos) Project > Uncheck \"Build Automatically\" Importing Git Repos \uf0c1 All of the git repos that were cloned in the previous step will need to be imported into Eclipse. But, be aware the awips2 repo is done last, because it requires different steps. File > Import > Git > Projects from Git > Next Continue with the default selection, Existing local repository > Add.. > add each of the git repos (for example .../awips2-core ) > check the checkbox > Finish Then for each of the repos (except awips2 right now): Select the repo name > Next > Continue with default selection (Working Tree) > Next > Continue with default selections (all choices selected) > Finish Finally, for awips2 repo, follow all the above steps except in the Working Tree, only select: cave > Next > Finish edexOsgi > Next > Finish Final Setup \uf0c1 Project > Clean > OK Use default selections: Clean all projects , Start a build immediately , Build the entire workspace Clean the build and ensure no errors are reported. Turn automatic building back on Project > Check \"Build Automatically\" 6. Run CAVE \uf0c1 CAVE can be ran from eclipse by using the com.raytheon.viz.product.awips/developer.product Double-click the developer.product file to open the Project Explorer in Eclipse. Select Overview > Synchronize Use the Project Explorer on the left-hand side of eclipse to run CAVE as a Java application or in Debug mode : Run Application \uf0c1 Select Run As > Eclipse Application Debug Application \uf0c1 Select Debug > Eclipse Application Troubleshooting \uf0c1 If you are getting a lot of errors, try changing your Java Compiler to 1.7, build the project, then change back to 1.8 and rebuild. Window > Preferences > Java > Compiler > Compiler compliance level setting","title":"Development"},{"location":"dev/awips-development-environment/#awips-development-environment-ade","text":"Detailed instructions on how to download the latest source code and run CAVE from Eclipse. It is important to keep in mind these instructions are intended for a system that is specifically used for developing AWIPS. It should not be used in conjunction with installed production versions of AWIPS. The following yum commands listed in these instructions may need to be run as the root user, but the rest of the commands should be run as the local user.","title":"AWIPS Development Environment (ADE)"},{"location":"dev/awips-development-environment/#1-remove-awips-instances","text":"First, make sure to remove any instances of AWIPS that are already installed, this can potentially cause problems when setting up the development environment. Below is an example that had CAVE installed. Uninstall with yum: yum clean all yum groupremove awips2-cave Check to make sure all rpms have been removed: rpm -qa | grep awips2 Remove the awips2 directory: rm -rf /awips2","title":"1. 
Remove AWIPS Instances"},{"location":"dev/awips-development-environment/#2-set-up-awips-repo","text":"Create a repo file named /etc/yum.repos.d/awips2.repo , and set the contents to the following: sudo vi /etc/yum.repos.d/awips2.repo [awips2repo] name=AWIPS II Repository baseurl=https://downloads.unidata.ucar.edu/awips2/current/linux/rpms/ el7-dev/ enabled=1 protect=0 gpgcheck=0 proxy=_none_ This file may already exist if AWIPS had been previously installed on the machine, so make sure to edit the baseurl.","title":"2. Set Up AWIPS Repo"},{"location":"dev/awips-development-environment/#3-install-the-ade","text":"Install the AWIPS Development Environment (ADE) using yum. This will install Eclipse (4.6.1), Java (1.8), Ant (1.9.6), Python 2.7 and its modules (Numpy, Matplotlib, Shapely, Jep, and others). yum clean all yum groupinstall awips2-ade Check the libGLU package is installed by running rpm -qa | grep mesa-libGLU . If nothing is returned, install the package via: yum install mesa-libGLU .","title":"3. Install the ADE"},{"location":"dev/awips-development-environment/#4-download-the-source-code","text":"If it's not already installed, install git: yum install git Next clone all of the required repositories for AWIPS: git clone https://github.com/Unidata/awips2.git git clone https://github.com/Unidata/awips2-cimss.git git clone https://github.com/Unidata/awips2-core.git git clone https://github.com/Unidata/awips2-core-foss.git git clone https://github.com/Unidata/awips2-drawing.git git clone https://github.com/Unidata/awips2-foss.git git clone https://github.com/Unidata/awips2-goesr.git git clone https://github.com/Unidata/awips2-gsd.git git clone https://github.com/Unidata/awips2-ncep.git git clone https://github.com/Unidata/awips2-nws.git Make sure to run git checkout in each repo if you'd wish to develop from a branch different from the default. It's best to do this before importing the repos into eclipse.","title":"4. Download the Source Code"},{"location":"dev/awips-development-environment/#5-configure-eclipse","text":"Open eclipse by running: /awips2/eclipse/eclipse It is fine to choose the default workspace upon starting up.","title":"5. Configure Eclipse"},{"location":"dev/awips-development-environment/#set-preferences","text":"Verify or make the following changes to set up eclipse for AWIPS development: Window > Preferences > Java > Installed JREs Set to /awips2/java Window > Preferences > PyDev > Interpreters > Python Interpreter Set to /awips2/python/bin/python Note: Add all paths to the SYSTEM pythonpath if prompted There might be some unresolved errors. These should be made to warnings instead. Window > Preferences > Java > Compiler > Building > Build path Problems > Circular Dependencies > Change to Warning Window > Preferences > Plug-in Development > API Baselines > Missing API Baseline > Change to Warning Turn off automatic building (you will turn this back on after importing the repos) Project > Uncheck \"Build Automatically\"","title":"Set Preferences"},{"location":"dev/awips-development-environment/#importing-git-repos","text":"All of the git repos that were cloned in the previous step will need to be imported into Eclipse. But, be aware the awips2 repo is done last, because it requires different steps. File > Import > Git > Projects from Git > Next Continue with the default selection, Existing local repository > Add.. 
> add each of the git repos (for example .../awips2-core ) > check the checkbox > Finish Then for each of the repos (except awips2 right now): Select the repo name > Next > Continue with default selection (Working Tree) > Next > Continue with default selections (all choices selected) > Finish Finally, for awips2 repo, follow all the above steps except in the Working Tree, only select: cave > Next > Finish edexOsgi > Next > Finish","title":"Importing Git Repos"},{"location":"dev/awips-development-environment/#final-setup","text":"Project > Clean > OK Use default selections: Clean all projects , Start a build immediately , Build the entire workspace Clean the build and ensure no errors are reported. Turn automatic building back on Project > Check \"Build Automatically\"","title":"Final Setup"},{"location":"dev/awips-development-environment/#6-run-cave","text":"CAVE can be ran from eclipse by using the com.raytheon.viz.product.awips/developer.product Double-click the developer.product file to open the Project Explorer in Eclipse. Select Overview > Synchronize Use the Project Explorer on the left-hand side of eclipse to run CAVE as a Java application or in Debug mode :","title":"6. Run CAVE"},{"location":"dev/awips-development-environment/#run-application","text":"Select Run As > Eclipse Application","title":"Run Application"},{"location":"dev/awips-development-environment/#debug-application","text":"Select Debug > Eclipse Application","title":"Debug Application"},{"location":"dev/awips-development-environment/#troubleshooting","text":"If you are getting a lot of errors, try changing your Java Compiler to 1.7, build the project, then change back to 1.8 and rebuild. Window > Preferences > Java > Compiler > Compiler compliance level setting","title":"Troubleshooting"},{"location":"dev/build-datadelivery/","text":"Data Delivery has been implemented into the AWIPS(II) baseline to provide access to data that is not resident locally at a Weather Forecast Office, River Forecast Center, or National Center. Data Delivery gives users the ability to create queries (One Time Requests) and subscriptions to data sets (provided OGC / OpenDAP servers such as THREDDS). build.edex/build.xml \uf0c1 ... Notice the last two commented out, com.raytheon.uf.edex.datadelivery.feature and com.raytheon.uf.edex.ogc.feature . These feature sets do not exist , but could easily be created in the same wat as other features (like com.raytheon.uf.common.base.feature , com.raytheon.uf.edex.base.feature , etc. wa-build \uf0c1 The source code comments provide the following guidance: In the work assignment's edexOsgi/build.edex directory, create a file named similar to the following: edexOsgi/build.edex/5-Data_Delivery-wa-build.properties In the file, there should be one line such as: wa.features=feature1,feature2 However, the wa-build Ant target requires a file features.txt exist. So if is 5-Data_Delivery-wa-build.properties or features.txt ? Because the delimiter being specified is a line separator (and not a comma \"wa.features=feature1,feature2\" as with versions proir to 16.2.2). So we can infer that a file should exist called features.txt should exist which has one WA feature per line. And what do you know, a similar file exist for the CAVE build in awips2-builds/cave/build/features.txt : cat awips2-builds/cave/build/features.txt com.raytheon.uf.common.base.feature com.raytheon.uf.viz.dataplugin.obs.feature ... ","title":"Build datadelivery"},{"location":"dev/build-datadelivery/#buildedexbuildxml","text":" ... 
Notice the last two commented out, com.raytheon.uf.edex.datadelivery.feature and com.raytheon.uf.edex.ogc.feature . These feature sets do not exist , but could easily be created in the same wat as other features (like com.raytheon.uf.common.base.feature , com.raytheon.uf.edex.base.feature , etc.","title":"build.edex/build.xml"},{"location":"dev/build-datadelivery/#wa-build","text":"The source code comments provide the following guidance: In the work assignment's edexOsgi/build.edex directory, create a file named similar to the following: edexOsgi/build.edex/5-Data_Delivery-wa-build.properties In the file, there should be one line such as: wa.features=feature1,feature2 However, the wa-build Ant target requires a file features.txt exist. So if is 5-Data_Delivery-wa-build.properties or features.txt ? Because the delimiter being specified is a line separator (and not a comma \"wa.features=feature1,feature2\" as with versions proir to 16.2.2). So we can infer that a file should exist called features.txt should exist which has one WA feature per line. And what do you know, a similar file exist for the CAVE build in awips2-builds/cave/build/features.txt : cat awips2-builds/cave/build/features.txt com.raytheon.uf.common.base.feature com.raytheon.uf.viz.dataplugin.obs.feature ... ","title":"wa-build"},{"location":"devguide/data-flow/","text":"Data Receipt \uf0c1 The LDM obtains a data product from an upstream LDM site on the IDD. The LDM writes the data to file in Raw Data Storage. The LDM uses edexBridge to post a \u201cdata available\u201d message to the Qpid message broker. The EDEX Ingest process obtains the \u201cdata available\u201d message from Qpid and removes the message from the message queue. The EDEX Ingest process obtains the data files from Raw Data Storage. This architecture provides separation between data sources and ingest processing. Any data source, not just the LDM/IDD, can follow this architecture to provide data for EDEX to process. Data Decoding \uf0c1 Data decoding is defined as the process of converting data from a raw format into a decoded format that is usable by CAVE. In AWIPS, data decoding is performed by the EDEX Ingest proessing ( ingest and ingestGrib ). EDEX Ingest obtains the \u201cdata available\u201d message from the Qpid message broker, and determines the appropriate data decoder based on the message contents. EDEX Ingest then forwards the message to the chosen decoder. Finally, the message is removed from the message queue. EDEX Ingest reads the data from Raw Data Storage. EDEX Ingest decodes the data. EDEX Ingest forwards the decoded data to Processed Data Storage. EDEX Ingest sends a message to the CAVE client indicating that newly-decoded data is available. It is important to note that in AWIPS all data types are processed by either the standard ingest process, or by the ingestGrib process, which handles all grib message ingestion. Once this data decoding process is complete, clients may obtain and perform additional processing on the data, including displaying data in CAVE. Processed Data Storage Architecture \uf0c1 Processed Data Storage refers to the persistence of decoded data and occurs in two separate forms: 1) metadata and some decoded data, which is stored in Postgres database tables; and 2) imagery data, gridded forecast data, and certain point data, which is stored in HDF5 files, and is managed by PyPIES. 
Writing to Processed Data Storage involves the following: 1) The EDEX Process sends serialized data, area data, and certain point data to PyPIES. 2) PyPIES writes the data to the HDF5 data store. 3) EDEX sends the metadata to the Postgres DBMS. 4) Postgres writes the metadata to the AWIPS database. For data not stored in HDF5, Steps 1 and 2 are skipped. For processed data retrieval, the process is reversed: 3) EDEX requests the metadata from Postgres. 4) Postgres reads the AWIPS database and returns the requested metadata to EDEX. 1) EDEX sends a data request to PyPIES. 2) PyPIES reads the data from the HDF5 data store and returns it to EDEX. In this case, if the data is not stored in HDF5, then Steps 3 and 4 are skipped. Data Retrieval Architecture \uf0c1 Data retrieval is the process by which the CAVE client obtains data using the EDEX Request server; the Request server obtains the data from processed data storage (Postgres and HDF5) and returns it to CAVE. CAVE sends a request via TCP to the EDEX Request server. EDEX Request server obtains the requested metadata via Postgres and stored data via PyPIES. EDEX Request forwards the requested data directly back to the CAVE client. For clustered EDEX servers using IPVS, this architecture allows CAVE clients to access any available EDEX Request process, providing an improvement in system reliability and speed. Data retrieval from processed data storage follows the same pattern as data storage: retrieval of HDF5 is handled by PyPIES; retrieval of database data is handled by Postgres. Data Purge Architecture \uf0c1 Raw data storage and processed data storage use two different purge mechanisms. For processed data storage, AWIPS implements a plugin based purge strategy, where the user has the option to change the purge frequency for each plugin individually. Raw Data Purge \uf0c1 Purging of Raw Data Storage is managed by the LDM user account cron, which executes the ldmadmin scour process, removing data files using an age-based strategy. The directories and retention times for raw data storage are controlled by scour.conf , which is located in the ldm user's ~/etc/ directory. Each entry in scour.conf contains the directory to manage, the retention time and an optional file name pattern for data files. An ldm user cron job executes ldmadmin. ldmadmin executes the LDM scour program. The LDM scour program deletes outdated data from AWIPS Raw Data Storage. Processed Data Purge \uf0c1 Rules for this version-based purge are contained in XML files located in /awips2/edex/data/utility/ . The purge is triggered by a quartz timer event that fires at 30 minutes after each hour. A Quartz event is triggered in the EDEX Ingest process causing the Purge Service to obtain a purge lock. If the lock is already taken, the Purge Service will exit, ensuring that only a single EDEX Ingest process performs the purge. The EDEX Purge Service sends a delete message to Postgres. Postgres deletes the specified data from the database. If HDF5 data is to be purged, the Purge Service messages PyPIES. PyPIES deletes the specified HDF5 files.","title":"Data flow"},{"location":"devguide/data-flow/#data-receipt","text":"The LDM obtains a data product from an upstream LDM site on the IDD. The LDM writes the data to file in Raw Data Storage. The LDM uses edexBridge to post a \u201cdata available\u201d message to the Qpid message broker. The EDEX Ingest process obtains the \u201cdata available\u201d message from Qpid and removes the message from the message queue. 
The EDEX Ingest process obtains the data files from Raw Data Storage. This architecture provides separation between data sources and ingest processing. Any data source, not just the LDM/IDD, can follow this architecture to provide data for EDEX to process.","title":"Data Receipt"},{"location":"devguide/data-flow/#data-decoding","text":"Data decoding is defined as the process of converting data from a raw format into a decoded format that is usable by CAVE. In AWIPS, data decoding is performed by the EDEX Ingest processes ( ingest and ingestGrib ). EDEX Ingest obtains the \u201cdata available\u201d message from the Qpid message broker, and determines the appropriate data decoder based on the message contents. EDEX Ingest then forwards the message to the chosen decoder. Finally, the message is removed from the message queue. EDEX Ingest reads the data from Raw Data Storage. EDEX Ingest decodes the data. EDEX Ingest forwards the decoded data to Processed Data Storage. EDEX Ingest sends a message to the CAVE client indicating that newly-decoded data is available. It is important to note that in AWIPS all data types are processed by either the standard ingest process, or by the ingestGrib process, which handles all grib message ingestion. Once this data decoding process is complete, clients may obtain and perform additional processing on the data, including displaying data in CAVE.","title":"Data Decoding"},{"location":"devguide/data-flow/#processed-data-storage-architecture","text":"Processed Data Storage refers to the persistence of decoded data and occurs in two separate forms: 1) metadata and some decoded data, which is stored in Postgres database tables; and 2) imagery data, gridded forecast data, and certain point data, which is stored in HDF5 files, and is managed by PyPIES. Writing to Processed Data Storage involves the following: 1) The EDEX Process sends serialized data, area data, and certain point data to PyPIES. 2) PyPIES writes the data to the HDF5 data store. 3) EDEX sends the metadata to the Postgres DBMS. 4) Postgres writes the metadata to the AWIPS database. For data not stored in HDF5, Steps 1 and 2 are skipped. For processed data retrieval, the process is reversed: 3) EDEX requests the metadata from Postgres. 4) Postgres reads the AWIPS database and returns the requested metadata to EDEX. 1) EDEX sends a data request to PyPIES. 2) PyPIES reads the data from the HDF5 data store and returns it to EDEX. In this case, if the data is not stored in HDF5, then Steps 3 and 4 are skipped.","title":"Processed Data Storage Architecture"},{"location":"devguide/data-flow/#data-retrieval-architecture","text":"Data retrieval is the process by which the CAVE client obtains data using the EDEX Request server; the Request server obtains the data from processed data storage (Postgres and HDF5) and returns it to CAVE. CAVE sends a request via TCP to the EDEX Request server. EDEX Request server obtains the requested metadata via Postgres and stored data via PyPIES. EDEX Request forwards the requested data directly back to the CAVE client. For clustered EDEX servers using IPVS, this architecture allows CAVE clients to access any available EDEX Request process, providing an improvement in system reliability and speed. 
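The request path described above can also be exercised without CAVE. A rough check, assuming the default EDEX http service port of 9581 and substituting your own server hostname for the placeholder below:
# confirm the EDEX Request server is answering on its services port
curl -s http://edex.example.com:9581/services | head
# on the EDEX machine itself, confirm a request JVM is running
ps aux | grep -i '[e]dex.*request'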
Data retrieval from processed data storage follows the same pattern as data storage: retrieval of HDF5 is handled by PyPIES; retrieval of database data is handled by Postgres.","title":"Data Retrieval Architecture"},{"location":"devguide/data-flow/#data-purge-architecture","text":"Raw data storage and processed data storage use two different purge mechanisms. For processed data storage, AWIPS implements a plugin based purge strategy, where the user has the option to change the purge frequency for each plugin individually.","title":"Data Purge Architecture"},{"location":"devguide/data-flow/#raw-data-purge","text":"Purging of Raw Data Storage is managed by the LDM user account cron, which executes the ldmadmin scour process, removing data files using an age-based strategy. The directories and retention times for raw data storage are controlled by scour.conf , which is located in the ldm user's ~/etc/ directory. Each entry in scour.conf contains the directory to manage, the retention time and an optional file name pattern for data files. An ldm user cron job executes ldmadmin. ldmadmin executes the LDM scour program. The LDM scour program deletes outdated data from AWIPS Raw Data Storage.","title":"Raw Data Purge"},{"location":"devguide/data-flow/#processed-data-purge","text":"Rules for this version-based purge are contained in XML files located in /awips2/edex/data/utility/ . The purge is triggered by a quartz timer event that fires at 30 minutes after each hour. A Quartz event is triggered in the EDEX Ingest process causing the Purge Service to obtain a purge lock. If the lock is already taken, the Purge Service will exit, ensuring that only a single EDEX Ingest process performs the purge. The EDEX Purge Service sends a delete message to Postgres. Postgres deletes the specified data from the database. If HDF5 data is to be purged, the Purge Service messages PyPIES. PyPIES deletes the specified HDF5 files.","title":"Processed Data Purge"},{"location":"devguide/file-system/","text":"The major file systems on the Linux-OS EDEX Data Server are as follows: Linux File Systems \uf0c1 root ( / ), /tmp , /usr , /var . Linux mandates that these file systems exist. /boot . This file system contains the Linux kernel and boot-up instructions. /home . This file system contains all the user working areas. /dev/shm . This file system is the Linux shared memory. /etc/init.d . Location of startup services ( edex_postgres , httpd-pypies , qpidd , edex_camel ). AWIPS File Systems \uf0c1 /awips2 . This file system is used to store baselined AWIPS software. /awips2/database/data . Database files. /awips2/edex/data/hdf5 . Contains the HDF5 component of the data store and shared static data and hydro apps. /awips2/GFESuite . Contains scripts and data relating to inter site coordination (ISC) and service backup. /awips2/edex/data/utility . Contains localization store and EDEX configuration files. /awips2/httpd_pypies/etc/https/conf . Location of PyPIES Apache server configuration file httpd.conf . /awips2/qpid/etc . Location of Qpid configuration file qpidd.conf . /awips2/qpid/sbin . Location of qpidd executable and queueCreator.sh , which is called by /etc/init.d/qpidd . /awips2/ldm . LDM account home directory. /awips2/ldm/etc . Location of ldmd.conf and pqact.conf . /awips2/ldm/logs . Location of LDM logs. Raw Data Store File System \uf0c1 Folders are usually laid out exactly like the sbn folders on the EDEX server with each plug-in having a folder on the data store. 
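A rough way to see that layout on a running server, and to confirm that EDEX is decoding what the LDM files there; directory and log names vary with the feeds configured in pqact.conf, so treat the paths below as examples:
# list the per-plugin folders in the raw data store and their sizes
du -sh /awips2/data_store/* 2>/dev/null | sort -h
# watch a decoder log under /awips2/edex/logs to confirm the matching plugin is ingesting
tail -f /awips2/edex/logs/edex-ingest-*$(date +%Y%m%d).log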
But some of them do not follow the same convention, for e.g., data sent to the 'metar' endpoint will be found in the /data_store/text folder. Additionally, if ingest of a new format is being worked on, you will find these new data types not yet found on the development or integration systems, located in /data_store/experimental .","title":"File system"},{"location":"devguide/file-system/#linux-file-systems","text":"root ( / ), /tmp , /usr , /var . Linux mandates that these file systems exist. /boot . This file system contains the Linux kernel and boot-up instructions. /home . This file system contains all the user working areas. /dev/shm . This file system is the Linux shared memory. /etc/init.d . Location of startup services ( edex_postgres , httpd-pypies , qpidd , edex_camel ).","title":"Linux File Systems"},{"location":"devguide/file-system/#awips-file-systems","text":"/awips2 . This file system is used to store baselined AWIPS software. /awips2/database/data . Database files. /awips2/edex/data/hdf5 . Contains the HDF5 component of the data store and shared static data and hydro apps. /awips2/GFESuite . Contains scripts and data relating to inter site coordination (ISC) and service backup. /awips2/edex/data/utility . Contains localization store and EDEX configuration files. /awips2/httpd_pypies/etc/https/conf . Location of PyPIES Apache server configuration file httpd.conf . /awips2/qpid/etc . Location of Qpid configuration file qpidd.conf . /awips2/qpid/sbin . Location of qpidd executable and queueCreator.sh , which is called by /etc/init.d/qpidd . /awips2/ldm . LDM account home directory. /awips2/ldm/etc . Location of ldmd.conf and pqact.conf . /awips2/ldm/logs . Location of LDM logs.","title":"AWIPS File Systems"},{"location":"devguide/file-system/#raw-data-store-file-system","text":"Folders are usually laid out exactly like the sbn folders on the EDEX server with each plug-in having a folder on the data store. But some of them do not follow the same convention, for e.g., data sent to the 'metar' endpoint will be found in the /data_store/text folder. Additionally, if ingest of a new format is being worked on, you will find these new data types not yet found on the development or integration systems, located in /data_store/experimental .","title":"Raw Data Store File System"},{"location":"devguide/linux-tools/","text":"Several standard Linux tools can be used to monitor the EDEX processes, and for the purposes of this document and the Unidata AWIPS Training Workshop, it is assumed that all are available and that the user has some knowledge of how they are used. Regardless, this document includes the full command syntax that can be copy and pasted from the document to the terminal. ps - Display information about specific processes ps aux | grep edex cat - Used to display a text file in a terminal cat /awips2/ldm/etc/pqact.conf tail - Used to provide a dynamic picture of process logs tail -f /awips2/ldm/logs/ldmd.conf grep - Used to filter content of process logs; used to filter output of other tools grep edexBridge /awips2/ldm/etc/ldmd.conf top - Provides a dynamic view of the memory and cpu usage of the EDEX processes psql - A terminal-based front-end to PostgreSQL. We will be executing SQL queries. You do not need to have previous experience with SQL to follow this guide, but navigating AWIPS metadata is made much easier with some experience. [awips@edex ~]$ psql metadata Type \"help\" for help. metadata=# help You are using psql, the command-line interface to PostgreSQL. 
Type: \\copyright for distribution terms \\h for help with SQL commands \\? for help with psql commands \\g or terminate with semicolon to execute query \\q to quit metadata=# \\dt sat* List of relations Schema | Name | Type | Owner --------+-----------------------------------+-------+------- awips | satellite | table | awips awips | satellite_creating_entities | table | awips awips | satellite_geostationary_positions | table | awips awips | satellite_physical_elements | table | awips awips | satellite_sector_ids | table | awips awips | satellite_sources | table | awips awips | satellite_spatial | table | awips awips | satellite_units | table | awips (8 rows) metadata=# \\q","title":"Linux tools"},{"location":"devguide/regular-expressions/","text":"AWIPS uses regular expressions for data filtering at two steps in the ingest process: the LDM uses regular expressions to determine which data to write to /awips2/data_store /. An example for radars products defined in /awips2/ldm/etc/pqact.conf NEXRAD3 ^(SDUS[23578].) .... (......) /p(...)(...) FILE -overwrite -close -edex /awips2/data_store/radar/\\4/\\3/\\1_\\4_\\3_\\2_(seq).rad The FILE option determines the actions on the product, in this case the name of the file (using \\n numeration) as determined by the values captured inside of parentheses ( read more about LDM pattern actions... ) EDEX Ingest uses regular expressions to determine routing of raw data to decoder plug-ins based on WMO header and file name ( Read more about WMO headers... ). An example for products defined in /awips2/edex/data/utility/edex_static/base/distribution/radar.xml ^SDUS[234578]. . ^Level2_. ^Level3_.* Standard LDM regular expressions from /awips2/ldm/etc/pqact.conf Level 3 Radar (All) \uf0c1 NEXRAD3 ^(SDUS[23578].) .... (......) /p(...)(...) FILE -overwrite -close -edex /awips2/data_store/radar/\\4/\\3/\\1_\\4_\\3_\\2_(seq).rad Level 3 Radar (Subset) \uf0c1 NEXRAD3 ^(SDUS[23578].) .... (......) /p(DHR|DPR|DSP|DTA|DAA|DU3|DU6|DVL|EET|HHC|N3P|N0C|N0K|N0Q|N0S|N0U|N0X|N0Z|NCR|NMD|OHA)(...) FILE -overwrite -close -edex /awips2/data_store/radar/\\4/\\3/\\1_\\4_\\3_\\2_(seq).rad FNEXRAD Composites \uf0c1 FNEXRAD ^rad/NEXRCOMP/(...)/(...)_(........)_(....) FILE -close -edex /awips2/data_store/sat/nexrcomp_\\3\\4_\\2.gini.png Satellite Imagery \uf0c1 # NOAAPORT GINI images NIMAGE ^satz/ch[0-9]/.*/(.*)/([12][0-9])([0-9][0-9])([01][0-9])([0-3][0-9]) ([0-2][0-9])([0-5][0-9])/(.*)/(.*km)/ FILE -close -overwrite -edex /awips2/data_store/sat/\\8/\\9/\\1_\\2\\3\\4\\5_\\6\\7 # -------- GOES-East/West Northern Hemisphere Composites -------- # GOES-East/West VIS composites UNIWISC ^pnga2area Q. (CV) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_VIS_VIS_\\6_\\7 # GOES-East/West 3.9 um composites UNIWISC ^pnga2area Q. (CS) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_3.9_3.9_\\6_\\7 # GOES-East/West WV composites UNIWISC ^pnga2area Q. (CW) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_WV_WV_\\6_\\7 # GOES-East/West IR composites UNIWISC ^pnga2area Q. (CI) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_IR_IR_\\6_\\7 # GOES-East/West 13.3 um composites UNIWISC ^pnga2area Q. (CL) (.*) (.*) (.*) (.*) (........) (....) 
PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_13.3_13.3_\\6_\\7 # ------------------- SSEC Global Composites ------------------- # Global WV composite UNIWISC ^pnga2area Q. (GW) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GLOBAL_\\5_WV_WVCOMP_\\6_\\7 # Global IR composites UNIWISC ^pnga2area Q. (GI) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GLOBAL_\\5_IR_IRCOMP_\\6_\\7 # ----------------- Mollweide Global Composites ----------------- # Mollweide Global Water Vapor UNIWISC ^pnga2area Q. (UY) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_MOLLWEIDE_30km_WV_MOLLWV_\\6_\\7 # Mollweide Global IR UNIWISC ^pnga2area Q. (UX) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_MOLLWEIDE_30km_IR_MOLLIR_\\6_\\7 # These work # GOES Visible (UV 4km VIS disabled) UNIWISC ^pnga2area Q. (EV|U9) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_VIS_\\4_\\6_\\7 # GOES Water Vapor UNIWISC ^pnga2area Q. (UW|UB) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_WV_\\4_\\6_\\7 # GOES Thermal Infrared UNIWISC ^pnga2area Q. (UI|U5) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_IR_\\4_\\6_\\7 # GOES other UNIWISC ^pnga2area Q. (UD|UE|U7|U8|) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_\\4_\\6_\\7 # Arctic UNIWISC ^pnga2area Q. (U[LNGHO]) (.*) (.*) (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ARCTIC_4km_\\4_\\6_\\7 # Antarctic VIS Composite UNIWISC ^pnga2area Q. (UJ) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_VIS_\\3_\\4_\\6_\\7 # Antarctic PCOL Composite UNIWISC ^pnga2area Q. (UK) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_PCOL_\\3_\\4_\\6_\\7 # Antarctic WV Composite UNIWISC ^pnga2area Q. (UF) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_WV_\\3_\\4_\\6_\\7 # Antarctic Composite IR UNIWISC ^pnga2area Q. (U1) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_IR_\\3_\\4_\\6_\\7 # GOES Sounder Derived Image Products from University of Wisconsin CIMSS # CIMSS CAPE - McIDAS product code CE UNIWISC ^pnga2area Q0 CE .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_CAPE_\\4_\\5 # CIMSS Cloud Top Pressure - McIDAS product code CA UNIWISC ^pnga2area Q0 CA .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_CTP_\\4_\\5 # CIMSS Lifted Index - McIDAS product code CD UNIWISC ^pnga2area Q0 CD .... 
(.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_LI_\\4_\\5 # CIMSS Ozone - McIDAS product code CF UNIWISC ^pnga2area Q0 CF .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_OZONE_\\4_\\5 # CIMSS Total Column Precipitable Water - McIDAS product code CB UNIWISC ^pnga2area Q0 CB .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_PW_\\4_\\5 # CIMSS Sea Surface Temperature - McIDAS product code CC UNIWISC ^pnga2area Q0 CC .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_SST_\\4_\\5 # CIMSS Northern Hemisphere Wildfire ABBA - McIDAS product code CG (inactive) UNIWISC ^pnga2area Q0 CG (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_FIRESNH_\\4_\\5 # CIMSS Southern Hemisphere Wildfire ABBA - McIDAS product code CH (inactive) UNIWISC ^pnga2area Q0 CH (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_FIRESSH_\\4_\\5 Gridded Model Data \uf0c1 # GFS 0.5 deg (gfs.tCCz.pgrb2.0p50.fFFF) all hours out to F384 CONDUIT ^data/nccf/com/.*gfs.t[0-9][0-9]z.(pgrb2.0p50).*!(grib2)/[^/]*/(SSIGFS|GFS)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.*! (......) FILE -overwrite -log -close -edex /awips2/data_store/grib2/conduit/GFS/\\5_\\6Z_\\7_\\8-(seq).\\1.grib2 # NAM-40km (awip3d) - exclude awip12 = NAM12 since it is on NGRID (exclude NAM 90km) CONDUIT ^data/nccf/com/nam/.*nam.*(awip3d).*!(grib2)/ncep/(NAM_84)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-1]..)/([^/]*)/.*! (......) FILE -overwrite -log -close -edex /awips2/data_store/grib2/conduit/\\3/\\5_\\6Z_\\7_\\8-(seq).\\1.grib2 # NOAAport HRRR NGRID Y.C.[0-9][0-9] KWBY ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{12})F(...)/(.*)/.* FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/HRRR/\\1_F\\2_\\3_(seq).grib2 # GFS40 40km NGRID ^[LM].R... KWBC ...... !grib2/[^/]*/([^/]*)/#(212)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # RAP-13km NGRID ^[LM].D... KWBG ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # RTMA 197 (5km) NGRID ^[LM].M... KWBR ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # RTMA-Mosaic 2.5km (I) and URMA2.5 (Q) NGRID ^[LM].[IQ]... KWBR ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # NamDNG 2.5 and 5km NGRID ^[LM].[IM]... KWBE ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # NAM12 (#218) NGRID ^[LM].B... KWBE ...... 
!grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # GEM 000 CMC_reg_USWRF_NTAT_0_ps15km_2015042818_P003.grib2 CMC CMC_reg_(.*)km_(..........)_P(...).grib2 FILE -overwrite -log -close -edex /awips2/data_store/grib2/cmc/cmc_reg_\\1km_\\2_P\\3.grib2 # FNMOC FNMOC ^US058.*(0018_0056|0022_0179|0027_0186|0060_0188|0063_0187|0110_0240|0111_0179|0135_0240|0078_0200)_(.*)_(.*)_(.*)-.* FILE -log -overwrite -close -edex /awips2/data_store/grib2/fnmoc/US_058_\\1_\\2_\\3_\\4-(seq).grib","title":"Regular expressions"},{"location":"devguide/regular-expressions/#level-3-radar-all","text":"NEXRAD3 ^(SDUS[23578].) .... (......) /p(...)(...) FILE -overwrite -close -edex /awips2/data_store/radar/\\4/\\3/\\1_\\4_\\3_\\2_(seq).rad","title":"Level 3 Radar (All)"},{"location":"devguide/regular-expressions/#level-3-radar-subset","text":"NEXRAD3 ^(SDUS[23578].) .... (......) /p(DHR|DPR|DSP|DTA|DAA|DU3|DU6|DVL|EET|HHC|N3P|N0C|N0K|N0Q|N0S|N0U|N0X|N0Z|NCR|NMD|OHA)(...) FILE -overwrite -close -edex /awips2/data_store/radar/\\4/\\3/\\1_\\4_\\3_\\2_(seq).rad","title":"Level 3 Radar (Subset)"},{"location":"devguide/regular-expressions/#fnexrad-composites","text":"FNEXRAD ^rad/NEXRCOMP/(...)/(...)_(........)_(....) FILE -close -edex /awips2/data_store/sat/nexrcomp_\\3\\4_\\2.gini.png","title":"FNEXRAD Composites"},{"location":"devguide/regular-expressions/#satellite-imagery","text":"# NOAAPORT GINI images NIMAGE ^satz/ch[0-9]/.*/(.*)/([12][0-9])([0-9][0-9])([01][0-9])([0-3][0-9]) ([0-2][0-9])([0-5][0-9])/(.*)/(.*km)/ FILE -close -overwrite -edex /awips2/data_store/sat/\\8/\\9/\\1_\\2\\3\\4\\5_\\6\\7 # -------- GOES-East/West Northern Hemisphere Composites -------- # GOES-East/West VIS composites UNIWISC ^pnga2area Q. (CV) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_VIS_VIS_\\6_\\7 # GOES-East/West 3.9 um composites UNIWISC ^pnga2area Q. (CS) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_3.9_3.9_\\6_\\7 # GOES-East/West WV composites UNIWISC ^pnga2area Q. (CW) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_WV_WV_\\6_\\7 # GOES-East/West IR composites UNIWISC ^pnga2area Q. (CI) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_IR_IR_\\6_\\7 # GOES-East/West 13.3 um composites UNIWISC ^pnga2area Q. (CL) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_13.3_13.3_\\6_\\7 # ------------------- SSEC Global Composites ------------------- # Global WV composite UNIWISC ^pnga2area Q. (GW) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GLOBAL_\\5_WV_WVCOMP_\\6_\\7 # Global IR composites UNIWISC ^pnga2area Q. (GI) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GLOBAL_\\5_IR_IRCOMP_\\6_\\7 # ----------------- Mollweide Global Composites ----------------- # Mollweide Global Water Vapor UNIWISC ^pnga2area Q. (UY) (.*) (.*)_IMG (.*)um (.*) (........) (....) 
PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_MOLLWEIDE_30km_WV_MOLLWV_\\6_\\7 # Mollweide Global IR UNIWISC ^pnga2area Q. (UX) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_MOLLWEIDE_30km_IR_MOLLIR_\\6_\\7 # These work # GOES Visible (UV 4km VIS disabled) UNIWISC ^pnga2area Q. (EV|U9) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_VIS_\\4_\\6_\\7 # GOES Water Vapor UNIWISC ^pnga2area Q. (UW|UB) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_WV_\\4_\\6_\\7 # GOES Thermal Infrared UNIWISC ^pnga2area Q. (UI|U5) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_IR_\\4_\\6_\\7 # GOES other UNIWISC ^pnga2area Q. (UD|UE|U7|U8|) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_\\4_\\6_\\7 # Arctic UNIWISC ^pnga2area Q. (U[LNGHO]) (.*) (.*) (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ARCTIC_4km_\\4_\\6_\\7 # Antarctic VIS Composite UNIWISC ^pnga2area Q. (UJ) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_VIS_\\3_\\4_\\6_\\7 # Antarctic PCOL Composite UNIWISC ^pnga2area Q. (UK) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_PCOL_\\3_\\4_\\6_\\7 # Antarctic WV Composite UNIWISC ^pnga2area Q. (UF) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_WV_\\3_\\4_\\6_\\7 # Antarctic Composite IR UNIWISC ^pnga2area Q. (U1) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_IR_\\3_\\4_\\6_\\7 # GOES Sounder Derived Image Products from University of Wisconsin CIMSS # CIMSS CAPE - McIDAS product code CE UNIWISC ^pnga2area Q0 CE .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_CAPE_\\4_\\5 # CIMSS Cloud Top Pressure - McIDAS product code CA UNIWISC ^pnga2area Q0 CA .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_CTP_\\4_\\5 # CIMSS Lifted Index - McIDAS product code CD UNIWISC ^pnga2area Q0 CD .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_LI_\\4_\\5 # CIMSS Ozone - McIDAS product code CF UNIWISC ^pnga2area Q0 CF .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_OZONE_\\4_\\5 # CIMSS Total Column Precipitable Water - McIDAS product code CB UNIWISC ^pnga2area Q0 CB .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_PW_\\4_\\5 # CIMSS Sea Surface Temperature - McIDAS product code CC UNIWISC ^pnga2area Q0 CC .... (.*) (.*) (.*) (........) (....) 
PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_SST_\\4_\\5 # CIMSS Northern Hemisphere Wildfire ABBA - McIDAS product code CG (inactive) UNIWISC ^pnga2area Q0 CG (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_FIRESNH_\\4_\\5 # CIMSS Southern Hemisphere Wildfire ABBA - McIDAS product code CH (inactive) UNIWISC ^pnga2area Q0 CH (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_FIRESSH_\\4_\\5","title":"Satellite Imagery"},{"location":"devguide/regular-expressions/#gridded-model-data","text":"# GFS 0.5 deg (gfs.tCCz.pgrb2.0p50.fFFF) all hours out to F384 CONDUIT ^data/nccf/com/.*gfs.t[0-9][0-9]z.(pgrb2.0p50).*!(grib2)/[^/]*/(SSIGFS|GFS)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.*! (......) FILE -overwrite -log -close -edex /awips2/data_store/grib2/conduit/GFS/\\5_\\6Z_\\7_\\8-(seq).\\1.grib2 # NAM-40km (awip3d) - exclude awip12 = NAM12 since it is on NGRID (exclude NAM 90km) CONDUIT ^data/nccf/com/nam/.*nam.*(awip3d).*!(grib2)/ncep/(NAM_84)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-1]..)/([^/]*)/.*! (......) FILE -overwrite -log -close -edex /awips2/data_store/grib2/conduit/\\3/\\5_\\6Z_\\7_\\8-(seq).\\1.grib2 # NOAAport HRRR NGRID Y.C.[0-9][0-9] KWBY ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{12})F(...)/(.*)/.* FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/HRRR/\\1_F\\2_\\3_(seq).grib2 # GFS40 40km NGRID ^[LM].R... KWBC ...... !grib2/[^/]*/([^/]*)/#(212)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # RAP-13km NGRID ^[LM].D... KWBG ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # RTMA 197 (5km) NGRID ^[LM].M... KWBR ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # RTMA-Mosaic 2.5km (I) and URMA2.5 (Q) NGRID ^[LM].[IQ]... KWBR ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # NamDNG 2.5 and 5km NGRID ^[LM].[IM]... KWBE ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # NAM12 (#218) NGRID ^[LM].B... KWBE ...... 
!grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # GEM 000 CMC_reg_USWRF_NTAT_0_ps15km_2015042818_P003.grib2 CMC CMC_reg_(.*)km_(..........)_P(...).grib2 FILE -overwrite -log -close -edex /awips2/data_store/grib2/cmc/cmc_reg_\\1km_\\2_P\\3.grib2 # FNMOC FNMOC ^US058.*(0018_0056|0022_0179|0027_0186|0060_0188|0063_0187|0110_0240|0111_0179|0135_0240|0078_0200)_(.*)_(.*)_(.*)-.* FILE -log -overwrite -close -edex /awips2/data_store/grib2/fnmoc/US_058_\\1_\\2_\\3_\\4-(seq).grib","title":"Gridded Model Data"},{"location":"edex/archiver/","text":"Grant Users Permission to Create Case Study Archives \uf0c1 The file /awips2/edex/data/utility/common_static/base/roles/archiveAdminRoles.xml controls which users can run the archiving tools from CAVE. Data Archiving This permission allows the user to access Archive Retention. This permission allows the user to access Archive Case Creation. archive.retention archive.casecreation will allow any connected CAVE user to run both the Archive Retention and the Archive Case Creation tools. If you want to control access to individual users, such as the example bwlo, which will allow any user to create case studies, but only the awips user to run the Archive Retention tool. archive.retention archive.casecreation Define EDEX User Administration Roles \uf0c1 Admins can use the CAVE User Administration interface to manage user access roles. The file /awips2/edex/data/utility/common_static/base/roles/awipsUserAdminRoles.xml controls access to this tool. User Administration This permission allows the user to access and edit AWIPS 2 User Administration awips.user.admin EDEX Archiver \uf0c1 /awips2/edex/conf/resources/com.raytheon.uf.edex.archive.cron.properties # enable archive archive.enable=false # runs database and hdf5 archive for archive server to pull data from archive.cron=0+40+*+*+*+? # path to store processed archive data archive.path=/awips2/archive # enable archive purge archive.purge.enable=true # when to purge archives archive.purge.cron=0+5+0/2+*+*+? # compress database records archive.compression.enable=false # To change Default case directory. #archive.case.directory=/awips2/edex/data/archiver/ # to disable a specific archive, use property archive.disable=pluginName,pluginName... #archive.disable=grid,text,acars The EDEX Archiver plugin can be used to automate data backup or create case study archive files to be retained by EDEX. The file /awips2/edex/data/utility/common_static/base/archiver/purger/PROCESSED_DATA.xml controls which products are ardhived, and how. Archive Log \uf0c1 The file /awips2/edex/logs/edex-ingest-archive-*.log will report status of the archiver whenever it is run. With regular archiving disabled (by default) will see messages such as INFO 2016-11-30 09:40:00,010 [Archiver] DataArchiver: EDEX - Archival of plugin data disabled, exiting INFO 2016-11-30 10:40:00,009 [Archiver] DataArchiver: EDEX - Archival of plugin data disabled, exiting INFO 2016-11-30 11:40:00,010 [Archiver] DataArchiver: EDEX - Archival of plugin data disabled, exiting INFO 2016-11-30 12:40:00,010 [Archiver] DataArchiver: EDEX - Archival of plugin data disabled, exiting /awips2/edex/data/utility/common_static/base/archiver/purger/PROCESSED_DATA.xml \uf0c1 , , , and are the four tags which configure the EDEX Archiver. 
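A sketch of turning the archiver on with the properties shown above and then watching its log; the sed edits assume the stock property names and default values listed above, and EDEX is restarted through the edex_camel service named elsewhere in this guide:
# enable archiving in the cron properties file
sudo sed -i 's/^archive.enable=false/archive.enable=true/' /awips2/edex/conf/resources/com.raytheon.uf.edex.archive.cron.properties
# optionally point archive.path at a larger disk first (the /data/archive path is an example only)
# sudo sed -i 's|^archive.path=.*|archive.path=/data/archive|' /awips2/edex/conf/resources/com.raytheon.uf.edex.archive.cron.properties
# restart EDEX so the new properties are read, then watch the archive log
sudo service edex_camel restart
tail -f /awips2/edex/logs/edex-ingest-archive-*.log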
Processed /awips2/archive/ 168 Model 168 (grid)/(.*)/(.*)/.*-(\\d{4})-(\\d{2})-(\\d{2})-(\\d{2})-.* {2} 4,5,6,7 (modelsounding)/(.*)/.*/.*(\\d{4})-(\\d{2})-(\\d{2})-(\\d{2}).* (bufrmos)(.*)/.*(\\d{4})-(\\d{2})-(\\d{2})-(\\d{2}) {1} - {2} 3,4,5,6 is used as a logical grouping of the archive sub-directories, and contains the following tags: - The id for the category, used in CAVE. - Optional. The hours to retain data in directories of selected Data Sets for a category. Default is 1 hour. - A directory matching . These are selected directories from the Retention GUI. The purger will used the category's instead of the Arhivie's . An Optional. may have more then one. (NOTE these are set internally when a selection configuration file is loaded. They should not appear in the configuration file.) - Required to have a least one. Each one contains a set of tags explained below. The tag contains: - A regex pattern for finding directories for this category. Required to have at least one. The pattern is relative to . Wildcard patterns do not cross directory delimiter / . Thus to match 3 levels of directories you would need .*/.*/.* (see patterns and groups section). There may be more then one in a , but they must all have the same number of groupings and be in the same order to match up with , and . - Optional. A pattern to find files in the directories that match . Default is everything in the directories that match . See patterns and groups section. - The label to display for directories that match . Any group in the may be made part of the label by placing the group's index inside parenthesis, {1} . More then one directory may match the . The archive GUIs may collapse them into a single table entry. - Optional tag to determine what type of time stamp is being used to get files/directories for retention and case creation. The value dictates how many groupings in the s and/or are used to get the time stamp for a file. The five values are: Date - (default) the time stamp is made up of 3 or 4 groups in the patterns: year , month , day and (optional) hour . Julian - Time stamp is made up of 2 or 3 groups in the patterns: year , day_of_year and (optional) hour . EpochSec - Time stamp epoch time in seconds. EpochMS - Time stamp epoch time in milliseconds. File - Instead use the files date of last modification. No group is used to get the time stamp. - Required tag when has any value but File . Date - A comma separated list of 3 or 4 numbers which are in order the index for year , month , day and hour . When only 3 numbers the hour is value is 23. Julian - A comma separated list of 2 or 3 numbers which are in order the index for year , day of year , and hour . When only two numbers the hour value is 23. EpochSec - A number which is the index for the epoch in seconds. EpochMS - A number which is the index for the epoch in milliseconds. File - Not needed since no group is used to get the time stamp. This is used to determine what files/directories to retain or a range of directories/files to copy for case creation. Note to get the group's index the and are combined. Thus if there are 5 groups in the then the first group in the is index 6. ## Patterns and groups. and use Java regex expressions , similar to the ldm's pqact.conf file. The groupings index start at one. The groups in the can be used in the . For example: (grib2)/(\\d{4})(\\d{2})(\\d{2})/(\\d{2})/(.*) {1} - {6} 2,3,4,5 contains six groups. 
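Because the directory and file patterns are ordinary regular expressions, the grouping in this example can be sanity-checked from the command line before editing the XML. A rough illustration against the sample path discussed below (GNU sed with -E is assumed):
# print the year/month/day/hour groups captured from a sample archive path
echo "grib2/20130527/18/GFS" | sed -E 's|(grib2)/([0-9]{4})([0-9]{2})([0-9]{2})/([0-9]{2})/(.*)|year=\2 month=\3 day=\4 hour=\5 label=\1 - \6|'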
The first group is the literal grib2 which matches only a directory named grib2 that is a sub-directory of the . The groups 2, 3 and 4 break apart the next level of sub-directories into a 4 digit and two 2 digit groups. This is the expected year , month , day sub-subdirectory indicated by the first 3 entries in . The next sub-directory contains the fifth group which is a two digit number representing the hour. Finally the sixth group will match any sub-directory that in the hour directory. Thus the directory paths /grib2/20130527/18/GFS will generate the display string, grib2 - GFS , and from the grouping we can find the year, 2013 ; month, 05 ; day, 27 and hour, 18 . Example with \uf0c1 hdf5/(redbook) {1} redbook-(\\d{4})-(\\d{2})-(\\d{2})-(\\d{2})\\..* 2,3,4,5 Example with multiple \uf0c1 Observation 168 (acars|airep|airmet|taf) (bufrsigwx|sfcobs)/.* {1} Date 2,3,4,5 .*-(\\d{4})-(\\d{2})-(\\d{2})-(\\d{2})\\..* The first looks for files matching the in the directories acars , airep , airmet or taf . The second expects to find the files in subdirectories of bufrsigwx or sfcobs such as bufrsigwx/SWH . Here the display will only show, redbook. The directory looked at will be /redbook/ . The all come from the since there is one group in the the groups in the start at two. This matches file names redbook-YYYY-MM-DD-HH. . Thus the file name redbook-2013-05-28-00.hd5 would match the . NOTE group {0} is a string that matches the whole . If this is used in the would see every directory that matches the pattern.","title":"Grant Users Permission to Create Case Study Archives"},{"location":"edex/archiver/#grant-users-permission-to-create-case-study-archives","text":"The file /awips2/edex/data/utility/common_static/base/roles/archiveAdminRoles.xml controls which users can run the archiving tools from CAVE. Data Archiving This permission allows the user to access Archive Retention. This permission allows the user to access Archive Case Creation. archive.retention archive.casecreation will allow any connected CAVE user to run both the Archive Retention and the Archive Case Creation tools. If you want to control access to individual users, such as the example bwlo, which will allow any user to create case studies, but only the awips user to run the Archive Retention tool. archive.retention archive.casecreation ","title":"Grant Users Permission to Create Case Study Archives"},{"location":"edex/archiver/#define-edex-user-administration-roles","text":"Admins can use the CAVE User Administration interface to manage user access roles. The file /awips2/edex/data/utility/common_static/base/roles/awipsUserAdminRoles.xml controls access to this tool. User Administration This permission allows the user to access and edit AWIPS 2 User Administration awips.user.admin ","title":"Define EDEX User Administration Roles"},{"location":"edex/archiver/#edex-archiver","text":"/awips2/edex/conf/resources/com.raytheon.uf.edex.archive.cron.properties # enable archive archive.enable=false # runs database and hdf5 archive for archive server to pull data from archive.cron=0+40+*+*+*+? # path to store processed archive data archive.path=/awips2/archive # enable archive purge archive.purge.enable=true # when to purge archives archive.purge.cron=0+5+0/2+*+*+? # compress database records archive.compression.enable=false # To change Default case directory. #archive.case.directory=/awips2/edex/data/archiver/ # to disable a specific archive, use property archive.disable=pluginName,pluginName... 
#archive.disable=grid,text,acars The EDEX Archiver plugin can be used to automate data backup or create case study archive files to be retained by EDEX. The file /awips2/edex/data/utility/common_static/base/archiver/purger/PROCESSED_DATA.xml controls which products are ardhived, and how.","title":"EDEX Archiver"},{"location":"edex/archiver/#archive-log","text":"The file /awips2/edex/logs/edex-ingest-archive-*.log will report status of the archiver whenever it is run. With regular archiving disabled (by default) will see messages such as INFO 2016-11-30 09:40:00,010 [Archiver] DataArchiver: EDEX - Archival of plugin data disabled, exiting INFO 2016-11-30 10:40:00,009 [Archiver] DataArchiver: EDEX - Archival of plugin data disabled, exiting INFO 2016-11-30 11:40:00,010 [Archiver] DataArchiver: EDEX - Archival of plugin data disabled, exiting INFO 2016-11-30 12:40:00,010 [Archiver] DataArchiver: EDEX - Archival of plugin data disabled, exiting","title":"Archive Log"},{"location":"edex/archiver/#awips2edexdatautilitycommon_staticbasearchiverpurgerprocessed_dataxml","text":" , , , and are the four tags which configure the EDEX Archiver. Processed /awips2/archive/ 168 Model 168 (grid)/(.*)/(.*)/.*-(\\d{4})-(\\d{2})-(\\d{2})-(\\d{2})-.* {2} 4,5,6,7 (modelsounding)/(.*)/.*/.*(\\d{4})-(\\d{2})-(\\d{2})-(\\d{2}).* (bufrmos)(.*)/.*(\\d{4})-(\\d{2})-(\\d{2})-(\\d{2}) {1} - {2} 3,4,5,6 is used as a logical grouping of the archive sub-directories, and contains the following tags: - The id for the category, used in CAVE. - Optional. The hours to retain data in directories of selected Data Sets for a category. Default is 1 hour. - A directory matching . These are selected directories from the Retention GUI. The purger will used the category's instead of the Arhivie's . An Optional. may have more then one. (NOTE these are set internally when a selection configuration file is loaded. They should not appear in the configuration file.) - Required to have a least one. Each one contains a set of tags explained below. The tag contains: - A regex pattern for finding directories for this category. Required to have at least one. The pattern is relative to . Wildcard patterns do not cross directory delimiter / . Thus to match 3 levels of directories you would need .*/.*/.* (see patterns and groups section). There may be more then one in a , but they must all have the same number of groupings and be in the same order to match up with , and . - Optional. A pattern to find files in the directories that match . Default is everything in the directories that match . See patterns and groups section. - The label to display for directories that match . Any group in the may be made part of the label by placing the group's index inside parenthesis, {1} . More then one directory may match the . The archive GUIs may collapse them into a single table entry. - Optional tag to determine what type of time stamp is being used to get files/directories for retention and case creation. The value dictates how many groupings in the s and/or are used to get the time stamp for a file. The five values are: Date - (default) the time stamp is made up of 3 or 4 groups in the patterns: year , month , day and (optional) hour . Julian - Time stamp is made up of 2 or 3 groups in the patterns: year , day_of_year and (optional) hour . EpochSec - Time stamp epoch time in seconds. EpochMS - Time stamp epoch time in milliseconds. File - Instead use the files date of last modification. No group is used to get the time stamp. 
- Required tag when has any value but File . Date - A comma separated list of 3 or 4 numbers which are in order the index for year , month , day and hour . When only 3 numbers the hour is value is 23. Julian - A comma separated list of 2 or 3 numbers which are in order the index for year , day of year , and hour . When only two numbers the hour value is 23. EpochSec - A number which is the index for the epoch in seconds. EpochMS - A number which is the index for the epoch in milliseconds. File - Not needed since no group is used to get the time stamp. This is used to determine what files/directories to retain or a range of directories/files to copy for case creation. Note to get the group's index the and are combined. Thus if there are 5 groups in the then the first group in the is index 6. ## Patterns and groups. and use Java regex expressions , similar to the ldm's pqact.conf file. The groupings index start at one. The groups in the can be used in the . For example: (grib2)/(\\d{4})(\\d{2})(\\d{2})/(\\d{2})/(.*) {1} - {6} 2,3,4,5 contains six groups. The first group is the literal grib2 which matches only a directory named grib2 that is a sub-directory of the . The groups 2, 3 and 4 break apart the next level of sub-directories into a 4 digit and two 2 digit groups. This is the expected year , month , day sub-subdirectory indicated by the first 3 entries in . The next sub-directory contains the fifth group which is a two digit number representing the hour. Finally the sixth group will match any sub-directory that in the hour directory. Thus the directory paths /grib2/20130527/18/GFS will generate the display string, grib2 - GFS , and from the grouping we can find the year, 2013 ; month, 05 ; day, 27 and hour, 18 .","title":"/awips2/edex/data/utility/common_static/base/archiver/purger/PROCESSED_DATA.xml"},{"location":"edex/archiver/#example-with-filepattern","text":"hdf5/(redbook) {1} redbook-(\\d{4})-(\\d{2})-(\\d{2})-(\\d{2})\\..* 2,3,4,5","title":"Example with <filePattern>"},{"location":"edex/archiver/#example-with-multiple-dirpattern","text":" Observation 168 (acars|airep|airmet|taf) (bufrsigwx|sfcobs)/.* {1} Date 2,3,4,5 .*-(\\d{4})-(\\d{2})-(\\d{2})-(\\d{2})\\..* The first looks for files matching the in the directories acars , airep , airmet or taf . The second expects to find the files in subdirectories of bufrsigwx or sfcobs such as bufrsigwx/SWH . Here the display will only show, redbook. The directory looked at will be /redbook/ . The all come from the since there is one group in the the groups in the start at two. This matches file names redbook-YYYY-MM-DD-HH. . Thus the file name redbook-2013-05-28-00.hd5 would match the . NOTE group {0} is a string that matches the whole . If this is used in the would see every directory that matches the pattern.","title":"Example with multiple <dirPattern>"},{"location":"edex/case-studies/","text":"Case Study Server Configuration \uf0c1 This document covers what is necessary to install and run AWIPS EDEX as an archive and case study server (no purging of processed data). Quick Install \uf0c1 Follow the EDEX Install Instructions including iptables config and an optional SSD mount (for large data volumes). 
groupadd fxalpha && useradd -G fxalpha awips mkdir -p /awips2/data_store wget -O /etc/yum.repos.d/awips2.repo https://downloads.unidata.ucar.edu/awips2/current/linux/awips2.repo yum clean all yum groupinstall awips2-server -y Disable Data Purging \uf0c1 The easiest way to disable data purging is to add an purge.* entry in /awips2/edex/conf/modes/modes.xml so that the purge plugin is not loaded when the EDEX ingest JVM is started: vi /awips2/edex/conf/modes/modes.xml .*request.* edex-security.xml ... purge.* ... Start EDEX \uf0c1 Start EDEX without running the LDM, since we do not want current data. Run the following command: edex start base Double check everything is running, except the LDM: edex [edex status] postgres :: running :: pid 43644 pypies :: running :: pid 3557 qpid :: running :: pid 43742 EDEXingest :: running :: pid 6564 44301 44597 EDEXgrib :: running :: pid 6565 44302 44598 EDEXrequest :: running :: pid 6566 44303 44599 ldmadmin :: not running Ingest Case Study Data \uf0c1 Raw data files of any type can be copied or moved into /awips2/data_store/ingest/ to be picked up and decoded by EDEX. Most data types are recognized by regular expression matching of the WMO Header or filename. Individual files can be ingested on the command line with the regex header/pattern supplied as the last argument: qpidNotify.py /full/path/to/data.file [regex match] For example: qpidNotify.py /home/awips/uniwisc_U5_132GOES-15_IMG10.7um_4km_20171024_1830.area.png uniwisc qpidNotify.py /awips2/data_store/grid/NAM12/conduit/NAM_CONUS_12km_conduit_20171025_1200Z_F084_TMPK-7.000007.grib2 grib qpidNotify.py /awips2/data_store/radar/FTG_N0Q_20171015_1815 Level3 Viewing Archive Data in CAVE \uf0c1 Because we are installing and configuring a standalone EDEX archive server without real-time LDM data ingest (and with purge disabled), any case study data that is ingested will be the \"latest available\" to CAVE, and you will see CAVE product menu time fill in with the latest of all data ingested. However, to display specific time-based data (in case you ingest more than one case study), there are two options: Set Load Mode to Inventory \uf0c1 In the top-left toolbar change Valid time seq to Inventory . Now any data product selected from the menus or the Product Browser should prompt you to select the exact time. Set Data Display Time in CAVE \uf0c1 At the bottom of the CAVE application, double-click the Time: entry to bring up a dialog window where you can set CAVE to a previous time, and choose the option of freezing CAVE at that time or allowing CAVE to \"move forward in time\" from that position as if it were real-time.","title":"Archive Case Studies"},{"location":"edex/case-studies/#case-study-server-configuration","text":"This document covers what is necessary to install and run AWIPS EDEX as an archive and case study server (no purging of processed data).","title":"Case Study Server Configuration"},{"location":"edex/case-studies/#quick-install","text":"Follow the EDEX Install Instructions including iptables config and an optional SSD mount (for large data volumes). 
groupadd fxalpha && useradd -G fxalpha awips mkdir -p /awips2/data_store wget -O /etc/yum.repos.d/awips2.repo https://downloads.unidata.ucar.edu/awips2/current/linux/awips2.repo yum clean all yum groupinstall awips2-server -y","title":"Quick Install"},{"location":"edex/case-studies/#disable-data-purging","text":"The easiest way to disable data purging is to add an purge.* entry in /awips2/edex/conf/modes/modes.xml so that the purge plugin is not loaded when the EDEX ingest JVM is started: vi /awips2/edex/conf/modes/modes.xml .*request.* edex-security.xml ... purge.* ... ","title":"Disable Data Purging"},{"location":"edex/case-studies/#start-edex","text":"Start EDEX without running the LDM, since we do not want current data. Run the following command: edex start base Double check everything is running, except the LDM: edex [edex status] postgres :: running :: pid 43644 pypies :: running :: pid 3557 qpid :: running :: pid 43742 EDEXingest :: running :: pid 6564 44301 44597 EDEXgrib :: running :: pid 6565 44302 44598 EDEXrequest :: running :: pid 6566 44303 44599 ldmadmin :: not running","title":"Start EDEX"},{"location":"edex/case-studies/#ingest-case-study-data","text":"Raw data files of any type can be copied or moved into /awips2/data_store/ingest/ to be picked up and decoded by EDEX. Most data types are recognized by regular expression matching of the WMO Header or filename. Individual files can be ingested on the command line with the regex header/pattern supplied as the last argument: qpidNotify.py /full/path/to/data.file [regex match] For example: qpidNotify.py /home/awips/uniwisc_U5_132GOES-15_IMG10.7um_4km_20171024_1830.area.png uniwisc qpidNotify.py /awips2/data_store/grid/NAM12/conduit/NAM_CONUS_12km_conduit_20171025_1200Z_F084_TMPK-7.000007.grib2 grib qpidNotify.py /awips2/data_store/radar/FTG_N0Q_20171015_1815 Level3","title":"Ingest Case Study Data"},{"location":"edex/case-studies/#viewing-archive-data-in-cave","text":"Because we are installing and configuring a standalone EDEX archive server without real-time LDM data ingest (and with purge disabled), any case study data that is ingested will be the \"latest available\" to CAVE, and you will see CAVE product menu time fill in with the latest of all data ingested. However, to display specific time-based data (in case you ingest more than one case study), there are two options:","title":"Viewing Archive Data in CAVE"},{"location":"edex/case-studies/#set-load-mode-to-inventory","text":"In the top-left toolbar change Valid time seq to Inventory . 
Now any data product selected from the menus or the Product Browser should prompt you to select the exact time.","title":"Set Load Mode to Inventory"},{"location":"edex/case-studies/#set-data-display-time-in-cave","text":"At the bottom of the CAVE application, double-click the Time: entry to bring up a dialog window where you can set CAVE to a previous time, and choose the option of freezing CAVE at that time or allowing CAVE to \"move forward in time\" from that position as if it were real-time.","title":"Set Data Display Time in CAVE"},{"location":"edex/data-access-plugins/","text":"EDEX Data Access Plugins \uf0c1 EDEX plugins which provide access to datasets via the python-awips Data Access Framework: Data Type Plugin Name ACARS acars-dataaccess AIREPs airep-dataaccess Lightning binlightning-dataaccess BUFR MOS bufrmos-dataaccess BUFR UA bufrua-dataaccess Climate DB climate-dataaccess Model Soundings modelsounding-dataaccess METAR obs-dataaccess PIREPs pirep-dataaccess Wind Profiler profiler-dataaccess Synop/Marine Obs sfcobs-dataaccess Warnings warning-dataaccess","title":"Data access plugins"},{"location":"edex/data-access-plugins/#edex-data-access-plugins","text":"EDEX plugins which provide access to datasets via the python-awips Data Access Framework: Data Type Plugin Name ACARS acars-dataaccess AIREPs airep-dataaccess Lightning binlightning-dataaccess BUFR MOS bufrmos-dataaccess BUFR UA bufrua-dataaccess Climate DB climate-dataaccess Model Soundings modelsounding-dataaccess METAR obs-dataaccess PIREPs pirep-dataaccess Wind Profiler profiler-dataaccess Synop/Marine Obs sfcobs-dataaccess Warnings warning-dataaccess","title":"EDEX Data Access Plugins"},{"location":"edex/data-distribution-files/","text":"Data Distribution Files \uf0c1 Overview \uf0c1 EDEX uses distribution files to alert the appropriate decoding plug-in that new data has been received. These files do so using XML and regular expressions. If the WMO header, or file name*, matches a regular expression listed in a distribution XML, then EDEX will put a message into the QPID queue for its corresponding decoder to recognize and process. It is worth noting that more than one distribution file can recognize a single piece of data and notify their decoders to act. *Sometimes the distribution file will not look at the filename. If this file is coming in through the LDM using a proper FILE action, then it is possible the distribution file will only look at the header and not the filename. If the file is ingested using the manual endpoint (/awips2/data_store/ingest/), then this behaviour could be different. If a piece of data does not match any distribution XML, EDEX will: Create an entry in /awips2/edex/logs/edex-ingest-unrecognized-files-yyyymmdd.log Skip processing of the unrecognized file. Distribution files are stored in the common_static branch of the Localization Store (series of directories that exist in /awips2/edex/data/utility/ ), and a list of available files can be found in the base-level directory. The base directory is: /awips2/edex/data/utility/common_static/base/distribution/ . For each plug-in, the distribution file is named [data-type].xml . For example, the distribution file for radar data is radar.xml . 
The distribution files follow the AWIPS base/site localization pattern: [root@edex]# cd /awips2/edex/data/utility/common_static/base/distribution [root@edex distribution]# ls acars.xml goesr.xml poessounding.xml airep.xml goessounding.xml profiler.xml airmet.xml grib.xml radar.xml atcf.xml intlsigmet.xml redbook.xml aww.xml lsr.xml satellite.gini.xml ... Creating a Site Override \uf0c1 Base files are located in /awips2/edex/data/utility/common_static/base/distribution/ Site override distribution files are located in /awips2/edex/data/utility/common_static/ site/XXX /distribution/ , where XXX is the site identifier. Note that site-level files override the base files; as a result, local modifications to distribution files must be made as follows: The base distribution file must be copied from /awips2/edex/data/utility/common_static/base/distribution to /awips2/edex/data/utility/common_static/site/XXX/distribution The local modification must be made to the file in /awips2/edex/data/utility/common_static/site/XXX/distribution The basic structure of the distribution file is: [pattern] [pattern] In each tag, [pattern] is replaced with a regular expression that will match either the filename or the WMO header of the raw data. Only data that matches a pattern in the distribution file will be processed. The contents of the base version of the radar distribution file: [root@edex]# cd /awips2/edex/data/utility/common_static/base/distribution/ [root@edex]# tail -4 radar.xml ^SDUS[234578]. .* ^Level3.* Looking at the base radar.xml distribution file in this example, there are two regular expressions. The first regular expression matches the standard WMO ID of radar products. Using edexBridge the LDM will place a message in the external.dropbox QPID queue, indicating a radar product has arrived. EDEX will then take the message containing the radar WMO ID (which comes from the file header) and compare it against the regular expressions in radar.xml. If a match is found, EDEX places a message in the QPID queue Ingest.radar. The radar decoder will then consume the message and process the radar data accordingly. Adding a REGEX to the Satellite Data Distribution File \uf0c1 As a quick example, suppose we have a local data source for satellite imagery that does not have a WMO header; also suppose that the data source writes to files whose names start with LOCAL.sat . To add this locally produced satellite data file to the EDEX distribution; perform the following steps. Copy the base version of satellite.gini.xml from the base distribution directory /awips2/edex/data/utility/common_static/base/distribution into the site distribution directory /awips2/edex/data/utility/common_static/site/XXX/distribution Edit the site version of satellite.gini.xml , adding a new tag immediately below the existing regular expression ( ) tag. The contents of the tag will be ^LOCAL.sat . The final result will be: TI[CGT]... .... rad/NEXRCOMP .\\*.gini.\\* ^LOCAL.sat.* Save the file and exit the editor. EDEX will automatically pick up the new distribution pattern. Raw files are written to subdirectories in /awips2/data_store/ , and a message is sent via QPID to the EDEX distribution service from the LDM. When a regular expression match is found in a data distribution file, the raw data file is placed in a queue for the matching plugin to decode and process. The distribution files are used to match file headers as well as filenames, which is how files dropped into EDEX's manual endpoint ( /awips2/data_store/ingest/ ) are processed. 
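(A minimal sketch, not taken from the base documentation, of the site-override workflow described above, assuming a site identifier of XXX and that the site-level distribution directory may not exist yet and so is created first:) [root@edex]# mkdir -p /awips2/edex/data/utility/common_static/site/XXX/distribution [root@edex]# cp /awips2/edex/data/utility/common_static/base/distribution/satellite.gini.xml /awips2/edex/data/utility/common_static/site/XXX/distribution/ [root@edex]# vi /awips2/edex/data/utility/common_static/site/XXX/distribution/satellite.gini.xml (add a regex entry containing ^LOCAL.sat.* alongside the existing patterns, then save). No restart should be needed; as noted above, EDEX automatically picks up the new distribution pattern.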
Editing an EDEX Data Distribution File \uf0c1 Because these files are in the common/_static directory, they have to be manually edited using a text editor. You should not edit the base files; rather, as stated above, you should copy the base version to your site and then edit the site version . The regular expressions in the distribution files do not necessarily need to correspond with the regular expressions in the LDM pqact.conf file. It is important to note that: The regex in the pqact.conf file applies to the productID that is passed through the LDM. and The regex in the distribution files (.xml) typically applies to the header in the file. It can also apply to the filename, if the file is coming through the manual endpoint, or if the data has no header to begin with. If patterns exist in pqact.conf but there are no corresponding matching regex expressions in any distribution file, then raw data files will be written to /awips2/data_store/ but will not be ingested and processed by EDEX. Entries for these non-ingested files would be written to the unrecognized files log in /awips/edex/logs . Examples \uf0c1 Surface Obs \uf0c1 Its distribution file is located at: /awips2/edex/data/utility/common_static/base/distribution/obs.xml : ^S[AP].* It will process any file header that starts with SA or SP , which should match any WMO header that contains METAR data (e.g. SAUS , SPUS , SACN , SAMX ). Text Data \uf0c1 Its distribution file is located at /awips2/edex/data/utility/common_static/base/distribution/text.xml : ^[ACFNRUW][A-Z][A-Z0-9]{4} [A-Z0-9]{4} ^S[A-CEG-Z].* ^T[BCX].* ^SF[A-OQ-TV-Z].* ^SDUS1.* ^SDUS4[1-6].* ^SDUS9[^7].* ^SFU[^S].* ^SFUS4[^1].* ^SFP[^A].* ^SFPA[^4].* ^SFPA4[^12].* ^BMBB91.* ^N[A-Z][A-Z0-9]{4} [A-Z0-9]{4} ^F[EHIJKLMQVWX].* wcl_decrypted ecmwf_mos_decrypted Processes lots of WM patterns. The second pattern ^S[A-CEG-Z].* matches any header that starts with S except for SD or SF . This is because it matches A through C ( A-C ), E, and G through Z ( G-Z ). So it also matches the SA and SP files that the obs.xml plugin matches. This means that METARs are processed by both plugins simultaneously. Grib Data \uf0c1 Its distribution file is located at /awips2/edex/data/utility/common_static/base/distribution/grib.xml : ^[EHLMOYZ][A-Z]{3}\\d{2} ^LZ[ABC][ABC]9[123] (KWBC|KNCF) ecmwf_decrypted \\p{Alpha}{3}_nwps_CG1 \\p{Alpha}{3}_nwps_CG0_Trkng .*grib.* .*GRIB.* .*grb.* ^US058.* ^CMC_reg.* The grib/grid decoder distribution file matches all numerical grids distributed over the IDD NGRID feed by matching WMO header, and from CONDUIT by matching various .grib file extensions. It also includes an example of a regexExclude message which can be used to single out matching values that aren't to be included. Addtional Information \uf0c1 Important notes about regular expressions: Any time a new entry is placed in the pqact.conf file on LDM, it is likely a corresponding entry needs to be added to the appropriate Data Distribution file in the data distribution directory, or the data file will be logged to edex-ingest-unrecognized-files-YYYYMMDD.log . The exception to this rule is if the new data coming from the LDM is a type of data that already exists and EDEX already has a distribution file with a matching regex that will recognize it. If you have written a new regex for a distribution file to match on a filename, and it is not matching, then the file most likely has a header. In this case EDEX will only look at the header to match the regex. 
You must change your regex to something that matches the header, not the filename.","title":"Data Distribution Files"},{"location":"edex/data-distribution-files/","text":"","title":"Data Distribution Files"},{"location":"edex/data-distribution-files/#overview","text":"EDEX uses distribution files to alert the appropriate decoding plug-in that new data has been received. These files do so using XML and regular expressions. If the WMO header, or file name*, matches a regular expression listed in a distribution XML, then EDEX will put a message into the QPID queue for its corresponding decoder to recognize and process. It is worth noting that more than one distribution file can recognize a single piece of data and notify their decoders to act. *Sometimes the distribution file will not look at the filename. If this file is coming in through the LDM using a proper FILE action, then it is possible the distribution file will only look at the header and not the filename. If the file is ingested using the manual endpoint (/awips2/data_store/ingest/), then this behaviour could be different. If a piece of data does not match any distribution XML, EDEX will: Create an entry in /awips2/edex/logs/edex-ingest-unrecognized-files-yyyymmdd.log Skip processing of the unrecognized file. Distribution files are stored in the common_static branch of the Localization Store (series of directories that exist in /awips2/edex/data/utility/ ), and a list of available files can be found in the base-level directory. The base directory is: /awips2/edex/data/utility/common_static/base/distribution/ . For each plug-in, the distribution file is named [data-type].xml . For example, the distribution file for radar data is radar.xml . The distribution files follow the AWIPS base/site localization pattern: [root@edex]# cd /awips2/edex/data/utility/common_static/base/distribution [root@edex distribution]# ls acars.xml goesr.xml poessounding.xml airep.xml goessounding.xml profiler.xml airmet.xml grib.xml radar.xml atcf.xml intlsigmet.xml redbook.xml aww.xml lsr.xml satellite.gini.xml ...","title":"Overview"},{"location":"edex/data-distribution-files/#creating-a-site-override","text":"Base files are located in /awips2/edex/data/utility/common_static/base/distribution/ Site override distribution files are located in /awips2/edex/data/utility/common_static/ site/XXX /distribution/ , where XXX is the site identifier. Note that site-level files override the base files; as a result, local modifications to distribution files must be made as follows: The base distribution file must be copied from /awips2/edex/data/utility/common_static/base/distribution to /awips2/edex/data/utility/common_static/site/XXX/distribution The local modification must be made to the file in /awips2/edex/data/utility/common_static/site/XXX/distribution The basic structure of the distribution file is: [pattern] [pattern] In each tag, [pattern] is replaced with a regular expression that will match either the filename or the WMO header of the raw data. Only data that matches a pattern in the distribution file will be processed. The contents of the base version of the radar distribution file: [root@edex]# cd /awips2/edex/data/utility/common_static/base/distribution/ [root@edex]# tail -4 radar.xml ^SDUS[234578]. .* ^Level3.* Looking at the base radar.xml distribution file in this example, there are two regular expressions. The first regular expression matches the standard WMO ID of radar products. 
Using edexBridge the LDM will place a message in the external.dropbox QPID queue, indicating a radar product has arrived. EDEX will then take the message containing the radar WMO ID (which comes from the file header) and compare it against the regular expressions in radar.xml. If a match is found, EDEX places a message in the QPID queue Ingest.radar. The radar decoder will then consume the message and process the radar data accordingly.","title":"Creating a Site Override"},{"location":"edex/data-distribution-files/#adding-a-regex-to-the-satellite-data-distribution-file","text":"As a quick example, suppose we have a local data source for satellite imagery that does not have a WMO header; also suppose that the data source writes to files whose names start with LOCAL.sat . To add this locally produced satellite data file to the EDEX distribution; perform the following steps. Copy the base version of satellite.gini.xml from the base distribution directory /awips2/edex/data/utility/common_static/base/distribution into the site distribution directory /awips2/edex/data/utility/common_static/site/XXX/distribution Edit the site version of satellite.gini.xml , adding a new tag immediately below the existing regular expression ( ) tag. The contents of the tag will be ^LOCAL.sat . The final result will be: TI[CGT]... .... rad/NEXRCOMP .\\*.gini.\\* ^LOCAL.sat.* Save the file and exit the editor. EDEX will automatically pick up the new distribution pattern. Raw files are written to subdirectories in /awips2/data_store/ , and a message is sent via QPID to the EDEX distribution service from the LDM. When a regular expression match is found in a data distribution file, the raw data file is placed in a queue for the matching plugin to decode and process. The distribution files are used to match file headers as well as filenames, which is how files dropped into EDEX's manual endpoint ( /awips2/data_store/ingest/ ) are processed.","title":"Adding a REGEX to the Satellite Data Distribution File"},{"location":"edex/data-distribution-files/#editing-an-edex-data-distribution-file","text":"Because these files are in the common/_static directory, they have to be manually edited using a text editor. You should not edit the base files; rather, as stated above, you should copy the base version to your site and then edit the site version . The regular expressions in the distribution files do not necessarily need to correspond with the regular expressions in the LDM pqact.conf file. It is important to note that: The regex in the pqact.conf file applies to the productID that is passed through the LDM. and The regex in the distribution files (.xml) typically applies to the header in the file. It can also apply to the filename, if the file is coming through the manual endpoint, or if the data has no header to begin with. If patterns exist in pqact.conf but there are no corresponding matching regex expressions in any distribution file, then raw data files will be written to /awips2/data_store/ but will not be ingested and processed by EDEX. 
Entries for these non-ingested files would be written to the unrecognized files log in /awips/edex/logs .","title":"Editing an EDEX Data Distribution File"},{"location":"edex/data-distribution-files/#examples","text":"","title":"Examples"},{"location":"edex/data-distribution-files/#surface-obs","text":"Its distribution file is located at: /awips2/edex/data/utility/common_static/base/distribution/obs.xml : ^S[AP].* It will process any file header that starts with SA or SP , which should match any WMO header that contains METAR data (e.g. SAUS , SPUS , SACN , SAMX ).","title":"Surface Obs"},{"location":"edex/data-distribution-files/#text-data","text":"Its distribution file is located at /awips2/edex/data/utility/common_static/base/distribution/text.xml : ^[ACFNRUW][A-Z][A-Z0-9]{4} [A-Z0-9]{4} ^S[A-CEG-Z].* ^T[BCX].* ^SF[A-OQ-TV-Z].* ^SDUS1.* ^SDUS4[1-6].* ^SDUS9[^7].* ^SFU[^S].* ^SFUS4[^1].* ^SFP[^A].* ^SFPA[^4].* ^SFPA4[^12].* ^BMBB91.* ^N[A-Z][A-Z0-9]{4} [A-Z0-9]{4} ^F[EHIJKLMQVWX].* wcl_decrypted ecmwf_mos_decrypted Processes lots of WM patterns. The second pattern ^S[A-CEG-Z].* matches any header that starts with S except for SD or SF . This is because it matches A through C ( A-C ), E, and G through Z ( G-Z ). So it also matches the SA and SP files that the obs.xml plugin matches. This means that METARs are processed by both plugins simultaneously.","title":"Text Data"},{"location":"edex/data-distribution-files/#grib-data","text":"Its distribution file is located at /awips2/edex/data/utility/common_static/base/distribution/grib.xml : ^[EHLMOYZ][A-Z]{3}\\d{2} ^LZ[ABC][ABC]9[123] (KWBC|KNCF) ecmwf_decrypted \\p{Alpha}{3}_nwps_CG1 \\p{Alpha}{3}_nwps_CG0_Trkng .*grib.* .*GRIB.* .*grb.* ^US058.* ^CMC_reg.* The grib/grid decoder distribution file matches all numerical grids distributed over the IDD NGRID feed by matching WMO header, and from CONDUIT by matching various .grib file extensions. It also includes an example of a regexExclude message which can be used to single out matching values that aren't to be included.","title":"Grib Data"},{"location":"edex/data-distribution-files/#addtional-information","text":"Important notes about regular expressions: Any time a new entry is placed in the pqact.conf file on LDM, it is likely a corresponding entry needs to be added to the appropriate Data Distribution file in the data distribution directory, or the data file will be logged to edex-ingest-unrecognized-files-YYYYMMDD.log . The exception to this rule is if the new data coming from the LDM is a type of data that already exists and EDEX already has a distribution file with a matching regex that will recognize it. If you have written a new regex for a distribution file to match on a filename, and it is not matching, then the file most likely has a header. In this case EDEX will only look at the header to match the regex. You must change your regex to something that matches the header, not the filename.","title":"Addtional Information"},{"location":"edex/data-grids/","text":"Available IDD Grids \uf0c1 The file /awips2/ldm/etc/pqact.conf defines which grids the LDM will request for EDEX ingest. After editing this file (as user awips ) you should run ldmadmin pqactHUP to re-read the new edits ( ldmadmin restart will also work). GFS \uf0c1 GFS Global 0.25 degree \uf0c1 CONDUIT ^data/nccf/com/.*gfs.t[0-9][0-9]z.pgrb2.0p25.*!grib2/[^/]*.*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.*! (......) 
FILE -edex -log /awips2/data_store/grid/GFS0p25/GFS_Global_0p25deg_\\1\\2\\3_\\4_\\5-(seq).grib2 GFS Global 1.0 degree \uf0c1 CONDUIT ^data/nccf/com/.*gfs.t[0-9][0-9]z.pgrb2.1p00.*!grib2/[^/]*/.*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.*! (......) FILE -edex -log /awips2/data_store/grid/GFS1p0/GFS_Global_onedeg_\\1\\2\\3_\\4_\\5-(seq).grib2 GFS Global 2.5 degree \uf0c1 CONDUIT ^data/nccf/com/.*gfs.t[0-9][0-9]z.pgrb2.2p50.*!grib2/[^/]*/.*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.*! (......) FILE -edex -log /awips2/data_store/grid/GFS2p5/GFS_Global_2p50deg_\\1\\2\\3_\\4_\\5-(seq).grib2 GFS Global 1.0 degree (NOAAPORT) \uf0c1 NGRID ^[YZ].P... KWBC ...... !grib2/ncep/GFS.*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/GFS1p0_noaaport/GFS_Global_onedeg_noaaport_\\1_\\2_\\3_\\4.grib2 GFS Pacific 40 km \uf0c1 NGRID ^[LM].O... KWBC ...... !grib2.*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/GFSPacific-40km/GFSPacific-40km_\\1_\\2_\\3_\\4-(seq).grib2 GFS Pacific 20 km Mercator \uf0c1 NGRID ^[YZ].F... KWBC ...... !grib2.*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/GFSPacific-20km/GFSPacific-20km_\\1_\\2_\\3_\\4-(seq).grib2 GFS Puerto Rico 0.5 degree \uf0c1 NGRID ^[LM].T... KWBC ...... !grib2.*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/PR-GFS0p5/PR-GFS0p5_\\1_\\2_\\3_\\4-(seq).grib2 GFS Puerto Rico 20 km Lat/Lon 0.25 degree \uf0c1 NGRID ^[YZ].E... KWBC ...... !grib2.*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/PR-GFS0p25/PR-GFS0p25_\\1_\\2_\\3_\\4-(seq).grib2 GFS CONUS 80 km \uf0c1 HDS ^[YZ].Q... KWBC ...... !grib.*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/GFS80/GFS80_\\1_\\2_\\3_\\4-(seq).grib2 GFS Alaska 95 km (GFS95) \uf0c1 NGRID ^[LM].H... KWBC ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/AK-GFS95/AK-GFS95\\1_\\2Z_\\3_\\4-(seq).grib2 GFS CONUS 20 km (GFS20) \uf0c1 NGRID ^[YZ].N... KWBC ...... !grib.*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/GFS20/GFS20_\\1_\\2_\\3_\\4-(seq).grib2 GFS Alaska 20 km (AK-GFS22) \uf0c1 NGRID ^[YZ].B... KWBC ...... !grib.*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/AK-GFS20/AK-GFS20_\\1_\\2_\\3_\\4-(seq).grib2 GFSGuide \uf0c1 NGRID ^[LM].I... KWBJ ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/GFSGuide/GFSGuide_\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 RTMA and URMA \uf0c1 RTMA 197 (5km) \uf0c1 NGRID ^[LM].M... KWBR ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/RTMA5/RTMA_5km_\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 RTMA-Mosaic 2.5km (I) \uf0c1 NGRID ^[LM].I... KWBR ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/RTMA/RTMA_2p5km_\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 URMA2.5 (Q) \uf0c1 NGRID ^[LM].Q... KWBR ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/URMA25/URMA_2p5km_\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 NAM \uf0c1 NAM CONUS 12 km (NAM12) - NOAAport \uf0c1 NGRID ^[LM].B... KWBE ...... 
!grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/NAM12/noaaport/NAM_CONUS_12km_noaaport_\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 NAM Alaska 11 km (AK-NAM11) \uf0c1 NGRID ^[LM].S... KWBE ...... !grib.*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/AK-NAM11/NAM_Alaska_11km_\\1_\\2_\\3_\\4-(seq).grib2 NAM Alaska 45 km GRID216 - CONDUIT (AK-NAM45) \uf0c1 CONDUIT ^data/nccf/com/nam/prod/nam.*t(..)z.awipak.* !grib2/ncep/NAM_84/#000/(............)(F...)/(.*)/.* FILE -edex -log /awips2/data_store/grid/AK-NAM45/conduit/NAM_Alaska_45km_conduit_\\2_\\3_\\4_\\5-(seq).grib2 NAM CONUS 12 km (NAM12) - CONDUIT \uf0c1 CONDUIT ^data/nccf/com/nam/.*nam.*awip12.*!grib2/ncep/NAM_84/#[^/]*/([0-9]{8})([0-9]{4})(F[0-1]..)/([^/]*)/.*! (......) FILE -edex -log /awips2/data_store/grid/NAM12/conduit/NAM_CONUS_12km_conduit_\\1_\\2Z_\\3_\\4-(seq).\\5.grib2 NAM CONUS 40 km (NAM40) - CONDUIT \uf0c1 CONDUIT ^data/nccf/com/nam/.*awip3d.*!grib2/ncep/NAM_84/#[^/]*/([0-9]{8})([0-9]{4})(F[0-1]..)/([^/]*)/.*! (......) FILE -edex -log /awips2/data_store/grid/NAM40/conduit/NAM_CONUS_40km_conduit_\\1_\\2Z_\\3_\\4-(seq).\\5.grib2 NAM CONUS 40 km (NAM40) - NOAAport \uf0c1 HDS ^[YZ].[A-WYZ].{1,3} KWBD ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/NAM40/noaaport/NAM_CONUS_40km_noaaport_\\1\\2_\\3_\\4-(seq).grib NAM Alaska 95 km (AK-NAM95) \uf0c1 HDS ^[YZ].N... KWBE .......*/m(ETA|NAM) !grib.*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/AK-NAM95/noaaport/NAM_Alaska_95km_\\2_\\3_\\4_\\5-(seq).grib2 NAM CONUS 80 km (NAM80) \uf0c1 HDS ^[YZ].Q... KWB. [0-3][0-9][0-2][0-9].*/m(ETA|NAM) !grib.*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/NAM80/NAM_CONUS_80km_\\1_\\2Z_\\3_\\4-(seq).grib2 NAM CONUS 20 km (NAM20) (removed??) \uf0c1 HDS ^[YZ].U... KWB. [0-3][0-9][0-2][0-9].*/m(ETA|NAM) !grib.*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/NAM20/NAM_CONUS_20km_\\2_\\3Z_\\4_\\5-(seq).\\1.grib2 NAM Alaska 45 km GRID216 - NOAAport (AK-NAM45) \uf0c1 HDS ^[YZ].V... KWB. .......*/m(ETA|NAM) !grib.*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/AK-NAM45/noaaport/NAM_Alaska_45km_noaaport_\\2_\\3_\\4_\\5-(seq).grib2 NAM Alaska 22 km (AK-NAM22) \uf0c1 HDS ^[YZ].Y... KWBE .......*/m(ETA|NAM) !grib.*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/AK-NAM22/NAM_Alaska_22km_\\2_\\3_\\4_\\5-(seq).grib2 NAM Puerto Rico Grid 237 (PR-NAM) \uf0c1 HDS ^[YZ].Z.{1,3} KWBE .......*/#([^/]*)/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/PR-NAM12/GRID\\8/Regional_NAM_GRID\\8_\\2_\\3_\\4_\\5-(seq).grib NAM Polar 90 km \uf0c1 CONDUIT ^data/nccf/com/nam/prod/nam........./nam.t..z.grbgrd.*NAM_84/#[^/]*/([0-9]{8})([0-9]{4})(F[0-1]..)/([^/]*)/.*! (......) FILE -edex -log /awips2/data_store/grid/NAM90/NAM_Polar_90km_\\1_\\2Z_\\3_\\4.\\5.grib2 NAM Fire Weather Nest \uf0c1 CONDUIT ^data/nccf/com/nam/prod/nam........./nam.t..z.firewxnest.*NMM_89/#[^/]*/([0-9]{8})([0-9]{4})(F[0-1]..)/([^/]*)/.*! (......) FILE -edex -log /awips2/data_store/grid/NAMFirewxnest/NAM_Firewxnest_\\1_\\2Z_\\3_\\4.\\5.grib2 NamDNG 2.5 km NGRID (NamDNG) \uf0c1 NGRID ^[LM].I... KWBE ...... 
!grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/NamDNG/ngrid/NamDNG_2p5km_\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 NamDNG 5 km (NamDNG5) \uf0c1 NGRID ^[LM].M... KWBE ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/NamDNG5/NamDNG_5km_\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 AK-NamDNG5 \uf0c1 HDS ^[LM].A.{1,3} KWB. .......*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/AK-NamDNG5/AK-NamDNG5_\\1_\\2\\3\\4-(seq).grib2 PR-NamDNG5 \uf0c1 HDS ^[LM].C.{1,3} KWB. .......*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/PR-NamDNG5/PR-NamDNG5_\\1_\\2\\3\\4-(seq).grib2 Hawaii-NamDNG5 \uf0c1 HDS ^[LM].H.{1,3} KWB. .......*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/Hawaii-NamDNG5/Hawaii-NamDNG5_\\1_\\2\\3\\4-(seq).grib2 AK NamDNG 3km \uf0c1 HDS ^[LM].K.{1,3} KWB. .......*/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/AK-NamDNG-3km/AK-NamDNG-3km_\\1_\\2\\3\\4.grib2 RAP \uf0c1 RAP CONUS 13 km (RAP13) \uf0c1 NGRID ^[LM].D... KWBG ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/RAP13/RR_CONUS_13km_\\3_\\4Z_\\5_\\6-(seq).grib2 RAP CONUS 20 km (RAP20) \uf0c1 CONDUIT ^data/nccf/com/rap.*awp252.*!grib2/ncep/RUC2/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*)/.*! (......) FILE -edex -log /awips2/data_store/grid/RAP20/RR_CONUS_20km_\\1_\\2Z_\\3_\\4-(seq).\\5.grib2 RAP CONUS 40 km (RAP40) - NOAAport \uf0c1 HDS ^[YZ].W.{1,3} KWBG ......[^!]*!grib/ncep/RUC2/#236/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/RAP40/noaaport/RR_CONUS_40km_noaaport_\\1_\\2Z_\\3_\\4-(seq).grib RAP CONUS 40 km (RAP40) - CONDIUT \uf0c1 CONDUIT ^data/nccf/com/rap.*awp236.*!grib2/ncep/RUC2/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*)/.*! (......) FILE -edex -log /awips2/data_store/grid/RAP40/conduit/RR_CONUS_40km_conduit_\\1_\\2Z_\\3_\\4-(seq).\\5.grib2 HRRR \uf0c1 HRRR - NOAAport 1hr \uf0c1 NGRID Y.C.[0-9][0-9] KWBY ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{12})F(...)/(.*)/.* FILE -edex -log /awips2/data_store/grid/HRRR/HRRR_CONUS_2p5km_\\1_F\\2_\\3-(seq).grib2 HRRRX - Experimental GSD Hourly and Sub-Hourly \uf0c1 FSL2 ^GRIB2\\.FSL\\.HRRR\\.(.......)_Lambert\\.(.*)(Minute|Hour)\\.(.*)\\.(.*)\\.([0-9]{12}).* FILE -edex -log /awips2/data_store/grid/HRRRX/HRRRX_CONUS_3km_\\6-\\1_\\3\\2_\\4_\\5-(seq).grib2 SHEF \uf0c1 SREF CONUS 40 km Ensemble Derived Products \uf0c1 NGRID ^[LM].R... KWBL ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{12})F(...)/(.*)/.* FILE -edex -log /awips2/data_store/grid/SREF212/noaaport/SREF_CONUS_40km_ensprod_(\\1:yyyy)(\\1:mm)\\1_\\2.grib2 SREF CONUS 40 km Bias Corrected Ensemble Derived Products \uf0c1 CONDUIT ^data/nccf/com/sref/prod/sref\\........./../(ensprod_biasc)/.*pgrb212.*!grib2/ncep/.*/#000/(............)(F...)/(.*)/.*! (......) FILE -edex -log /awips2/data_store/grid/SREF212/conduit/SREF_CONUS_40km_\\1_\\2_\\3_\\4_\\5-(seq).grib2 SREF CONUS 40 km Bias Corrected Ensemble Members \uf0c1 CONDUIT ^data/nccf/com/sref/prod/sref\\.(........)/(..)/(pgrb_biasc)/.*pgrb212.*!grib2/ncep/.*/#000/............(F...)/(.*)/.*! (......) FILE -edex -log /awips2/data_store/grid/SREF212/conduit/SREF_CONUS_40km_\\3_\\1_\\200_\\4_\\5_\\6.grib2 SREF Alaska 45 km Ensemble Derived Products \uf0c1 NGRID ^[LM].V... 
KWBL .......*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/AK-SREF45/SREF_Alaska_45km_ensprod_\\1\\2_\\3_\\4-(seq).grib2 SREF Pacific Northeast 0.4 degree Ensemble Derived Products \uf0c1 NGRID ^[LM].X... KWBL .......*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/EPac-SREF/SREF_PacificNE_0p4_ensprod_\\1\\2_\\3_\\4-(seq).grib2 FNMOC \uf0c1 Navy Coupled Ocean Data Assimilation (NCODA) - Global 0.25 degree \uf0c1 FNMOC ^US058.*0078_0200_(.*)_(.*)_(.*)-(.*) FILE -edex -log /awips2/data_store/grid/FNMOC-NCODA/fnmoc_NCODA_Global_Ocean_\\1_\\2_\\3_\\4.grib NAVy Global Environmental Model (NAVGEM) - 0.5 degree \uf0c1 FNMOC ^US058.*0018_0056_(.*)_(.*)_(.*)-(.*) FILE -edex -log /awips2/data_store/grid/FNMOC-NAVGEM/fnmoc_NAVGEM_Global_0p5deg_\\1_\\2_\\3_\\4.grib Forecast of Aerosol Radiative Optical Properties (FAROP) - Global 1.0 degree \uf0c1 FNMOC ^US058.*0135_0240_(.*)_(.*)_(.*)-(.*) FILE -edex -log /awips2/data_store/grid/FNMOC-FAROP/fnmoc_FAROP_Global_1p0deg_\\1_\\2_\\3_\\4.grib WAVEWATCH III (WW3) - Global 1.0 degree \uf0c1 FNMOC ^US058.*0110_0240_(.*)_(.*)_(.*)-(.*) FILE -edex -log /awips2/data_store/grid/FNMOC-WW3-Global_1p0deg/fnmoc_WW3_Global_1p0deg_\\1_\\2_\\3_\\4.grib WW3 Europe \uf0c1 FNMOC ^US058.{4}-GR1dyn\\.WW3-EURO_EURO-..-.._.{5}.{4}(.{4})(..)(..)(..)(.*) FILE -edex -log /awips2/data_store/grid/FNMOC-WW3-Europe/fnmoc_WW3_Europe_\\1\\2\\3_\\400_\\5.grib Coupled Ocean/Atmospheric Mesoscale Prediction System (COAMPS) - Western Atlantic \uf0c1 FNMOC ^US058.{4}-GR1dyn\\.COAMPS-NWATL_.{5}-.{2}-.{2}_(.{9})(..........)(.*) FILE -edex -log /awips2/data_store/grid/FNMOC-COAMPS-Western_Atlantic/fnmoc_COAMPS_Western_Atlantic_\\1\\2\\3-(seq).grib COAMPS Europe \uf0c1 FNMOC ^US058.{4}-GR1dyn\\.COAMPS-EURO_.{4}-.{2}-.{2}_(.{9})(..........)(.*) FILE -edex -log /awips2/data_store/grid/FNMOC-COAMPS-Europe/fnmoc_COAMPS_Europe_\\1\\2\\3-(seq).grib COAMPS Equatorial America \uf0c1 FNMOC ^US058.{4}-GR1dyn\\.COAMPS-EQAM_.{4}-.{2}-.{2}_(.{9})(..........)(.*) FILE -edex -log /awips2/data_store/grid/FNMOC-COAMPS-Equatorial_America/fnmoc_COAMPS_Equatorial_America_\\1\\2\\3-(seq).grib COAMPS Northeast Pacific \uf0c1 FNMOC ^US058.{4}-GR1dyn\\.COAMPS-NEPAC_.{5}-.{2}-.{2}_(.{9})(..........)(.*) FILE -edex -log /awips2/data_store/grid/FNMOC-COAMPS-Northeast_Pacific/fnmoc_COAMPS_Northeast_Pacific_\\1\\2\\3-(seq).grib COAMPS Southern California \uf0c1 FNMOC ^US058.{4}-GR1dyn\\.COAMPS-SOCAL_.{5}-.{2}-.{2}_(.{9})(..........)(.*) FILE -edex -log /awips2/data_store/grid/FNMOC-COAMPS-Southern_California/fnmoc_COAMPS_Southern_California_\\1\\2\\3-(seq).grib Multi-Radar Multi-Sensor (MRMS) - NOAAport \uf0c1 Full Feed \uf0c1 NGRID ^YAU[CDLMPQS].. KWNR ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.* FILE -edex -log /awips2/data_store/grid/MRMS/MRMS_\\1_\\2_\\3_\\4.grib2 MRMS Precipitation Products \uf0c1 NGRID ^YAU[DP].. KWNR ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.* FILE -edex -log /awips2/data_store/grid/MRMS-precip/MRMS_\\1_\\2_\\3_\\4.grib2 MRMS Model Parameters (on different grid) \uf0c1 NGRID ^YAUM.. KWNR ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.* FILE -edex -log /awips2/data_store/grid/MRMS-model/MRMS_\\1_\\2_\\3_\\4.grib2 MRMS Lightning Products from NLDN \uf0c1 NGRID ^YAUL.. KWNR ...... 
!grib2/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.* FILE -edex -log /awips2/data_store/grid/MRMS-lightning/MRMS_\\1_\\2_\\3_\\4.grib2 MRMS Rotation Track Products (on different grid) \uf0c1 NGRID ^YAUS0[0-4] KWNR ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.* FILE -edex -log /awips2/data_store/grid/MRMS-rotation/MRMS_\\1_\\2_\\3_\\4.grib2 MRMS Mid-level Rotation Track Products (on different grid) \uf0c1 NGRID ^YAUS0[5-9] KWNR ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.* FILE -edex -log /awips2/data_store/grid/MRMS-rotation-ml/MRMS_\\1_\\2_\\3_\\4.grib2 MRMS Merged Base Reflectivity \uf0c1 NGRID ^YAUQ.. KWNR ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.* FILE -edex -log /awips2/data_store/grid/MRMS-merged/MRMS_\\1_\\2_\\3_\\4.grib2 MRMS Radar Products \uf0c1 NGRID ^YAU(C[0-9]|S[1-9])[0-9] KWNR ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.* FILE -edex -log /awips2/data_store/grid/MRMS-radar/MRMS_\\2_\\3_\\4_\\5.grib2 MRMS Anything else (mainly future proofing) \uf0c1 NGRID ^YAU[ABE-KNORT-Z][0-9][0-9] KWNR ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.* FILE -edex -log /awips2/data_store/grid/MRMS-other/MRMS_\\1_\\2_\\3_\\4.grib2 ECMWF \uf0c1 ECMF-Global, ECMF1..ECMF12 \uf0c1 HDS ^H..... ECM. ......[^!]*!grib.*/[^/]*/[^/]*/#([^/]*)/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/ECMWF/ECMWF-\\1_\\2_\\3_\\4_\\5-(seq).grib Other \uf0c1 Canadian GEM Regional Model (CMC) \uf0c1 CMC ^CMC_reg_(.*)km_(..........)_P(...).grib2 FILE -edex -log /awips2/data_store/grid/CMC/CMC_reg_\\1km_\\2_P\\3.grib2 UKMET Global \uf0c1 HDS ^H..... EGRR ......[^!]*!grib/ukmet/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/UKMET-\\1-GRID\\2/UKMET-\\1-GRID\\2_\\3\\4_\\5_\\6-(seq).grib1 National Precipitation Verification Unit (NPVU) - RFC Multisensor Precipitation Estimates (MPE) (MPE-Local-..., MPE-Mosaic-...) \uf0c1 HDS ^ZETA98 (....) ([0-3][0-9])([0-2][0-9]).*/m(.......) FILE -edex -log /awips2/data_store/grid/MPE-\\1/MPE-\\1_\\4_(\\2:yyyy)(\\2:mm)\\2.grib Automated Satellite Precipitation Estimates from NESDIS (hourly) (AUTOSPE) \uf0c1 HDS ^ZETA98 K[NW][EN][ES] ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})/([^/]*)/([^/]*) FILE -edex -log /awips2/data_store/grid/AUTOSPE/AUTOSPE-\\1\\2_\\3_\\4-(seq).grib2 River Forecast Center (RFC) Quantitative Precipitation Estimation (QPE) \uf0c1 HDS ^ZETA98 (KTUA|PACR|KSTR|KRSA|KORN|KRHA|KKRF|KMSR|KTAR|KPTR|KTIR|KALR|KFWR) ......[^!]*!(grib|grib2)/[^/]*/([^/]*)/#255/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/QPE-RFC/QPE-RFC-\\1_\\3_\\4_\\5_\\6_\\7-(seq).grib Ocean Sea Surface Temperature (SST) Grids #61-64 \uf0c1 HDS ^H.[T-W] FILE -edex -log /awips2/data_store/grid/SST/%Y%m%d%H%M.sst.grib HPCGuide \uf0c1 NGRID ^([LM][ABCDFGH]U...) (KWBN) (..)(..)(..)[^!]*!(grib|grib2)/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/HPCGuide/GRID\\8/HPCGuide_GRID\\8_\\(10)Z_\\(11)_\\(12)-\\1_\\2_\\3\\4\\5-(seq).grib2 National Convective Weather Forecast (NCWF) \uf0c1 ANY ^ZDIA98 (....) 
......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})/(F[0-9]{3}) FILE -edex -log /awips2/data_store/grid/NCWF/NCWF_\\2_\\3_\\4_\\1-(seq).grib National Operational Hydrologic Remote Sensing Center Snow Analysis (NOHRSC-SNOW) \uf0c1 HDS ^[YZ][ES]QA88 KMSR ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/NOHRSC-SNOW/NOHRSC-SNOW_\\1_\\2_\\3_\\4-(seq).grib GFS MOS-Based Localized Aviation MOS Program (LAMP) guidance - LAMP2p5, GFSLAMP5 \uf0c1 NGRID ^([LM].[ABDHMNRSTU].{1,3}) (KWNO|KMDL) (..)(..)(..)[^!]*!(grib2)/[^/]*/(LAMP)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/LAMP/\\7_\\9_\\(10)Z_\\(11)_\\(12)-\\1_\\2-(seq).grib2 Radar Coded Messages (RCM) \uf0c1 HDS ^HAXA00 KWBC ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/RCM/RCM_\\1_\\2_\\3_\\4-(seq).grib National Digital Forecast Database (NDFD) \uf0c1 CONDUIT grib2/nwstg/NWS_0/..../(........)(....)(F...)/(.*)/.*! (......) FILE -edex -log /awips2/data_store/grid/NDFD/NDFD_\\1\\2\\3_\\4-\\5.grib2 NDFD WPC Quantitative Precipitation Forecast (HPCqpfNDFD) \uf0c1 NGRID ^[LM].[MN].98 KWNH ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/HPCqpf-ngrid/HPCqpf_\\1_\\2_\\3_\\4-(seq).grib2 NDFD WPC Day 1-3 Excessive Rainfall Outlook (HPCqpfNDFD) \uf0c1 HDS ^[LM].[MN].98 KWNH ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/HPCqpf-hds/HPCqpf_\\1_\\2_\\3_\\4-(seq).grib MOSGuide/MOSGuideExtended/GMOS \uf0c1 NGRID ^(Y.UZ9[0-9]) (KWB.) (..)(..)(..) FILE -edex -log /awips2/data_store/grid/MOSGuide/MOSGuide_\\1_\\2_\\3\\4\\5-(seq).grib2 Flash Flood Guidance (FFG) grids - 1HR=HPBL, 3HR=5WAVH, 6HR=CNWAT (FFG-PTR...) \uf0c1 HDS ^ZEGZ98 K.{3} .......*/#([^/]*)/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/FFG/FFG-\\1_\\2_\\3_\\4_\\5-(seq).grib PROB3HR/#236 \uf0c1 HDS ^Z[DE]W[A-D][89]8 KWNO ...... 
/m0 !grib.*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/PROB3HR/PROB3HR_\\1_\\2_\\3_\\4-(seq).grib National Data Buoy Center (NDBC) High Frequency Radar (HFR) Total Vector Velocity (TVV) \uf0c1 NGRID ^OUTA98 KWNB (......)[^!]*!grib2 FILE -edex -log /awips2/data_store/grid/HFR/HFR_\\1-(seq).grib2 Regional River Forecast Cebter (RFC) Quantitative Precipitation Forecast (QPF) (RFCqpf) \uf0c1 HDS ^YEI.[89]8 (KALR|KFWR|KKRF|KMSR|KORN|KPTR|KRHA|KRSA|KSTR|KTAR|KTIR|KTUA) .......*/#([^/]*)/([0-9]{8})([0-9]{4})/(F[0-9]{3})/[^/]*/([^/]*) FILE -edex -log /awips2/data_store/grid/Regional_RFC_QPF/GRID\\2/\\3_\\4_\\5_\\6-(seq).grib GRID218 = HPCqpf \uf0c1 HDS ^(ZEX.98) KWNH (..)(..)(..)[^!]*!(grib|grib2)/[^/]*/([^/]*)/#218/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/HPCqpf/HPCqpf_\\9Z_\\(10)_\\(11)-\\1_KWNH_\\2\\3\\4-(seq).grib Regional River Forecast Cebter (RFC) Quantitative Precipitation Forecast (QPF) \uf0c1 HDS ^(YEI.[89]8) KWNH (..)(..)(..)[^!]*!(grib|grib2)/ncep/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/RFC_QPF_GRID\\7/\\9Z_\\(10)_\\(11)-\\1_KWNH_\\2\\3\\4-(seq).grib Ocean Models \uf0c1 WAVE 233 Grid - Global NOAA WAVEWATCH III (WaveWatch) \uf0c1 HDS ^O.J.88 KWBJ.*!grib/ncep/[^/]*/#[^/]*/(............)/F(...)/(.*)/(.*)/ FILE -edex -log /awips2/data_store/grid/WAVE-HDS-WaveWatch/WaveWatch_\\1_F\\2_\\3_\\4_%H%M%S.grib WAVE 238 Grid - Regional Western North Atlantic (WAVE-WNA) \uf0c1 HDS ^O.M.88 KWBJ.*!grib/ncep/[^/]*/#[^/]*/(............)/F(...)/(.*)/(.*)/ FILE -edex -log /awips2/data_store/grid/WAVE-HDS-WNAWAVE238/WNAWAVE238_\\1_F\\2_\\3_\\4_%H%M%S.grib WAVE 238 Grid - Regional Western North Atlantic Hurricane (WAVE-WNA-Hurr) \uf0c1 HDS ^O.O.88 KWBJ.*!grib/ncep/[^/]*/#[^/]*/(............)/F(...)/(.*)/(.*)/ FILE -edex -log /awips2/data_store/grid/WAVE-HDS-HurWave238/HurWave238_\\1_F\\2_\\3_\\4_%H%M%S.grib WAVE 239 Grid - Regional Alaska Waters (WAVE-AK) \uf0c1 HDS ^O.N.88 KWBJ.*!grib/ncep/[^/]*/#[^/]*/(............)/F(...)/(.*)/(.*)/ FILE -edex -log /awips2/data_store/grid/WAVE-HDS-AKWAVE239/AKWAVE239_\\1_F\\2_\\3_\\4_%H%M%S.grib WAVE 253 Grid - Regional Eastern North Pacific (WAVE-ENP) \uf0c1 HDS ^O.S.88 KWBJ.*!grib/ncep/[^/]*/#[^/]*/(............)/F(...)/(.*)/(.*)/ FILE -edex -log /awips2/data_store/grid/WAVE-HDS-ENPWAVE253/ENPWAVE253_\\1_F\\2_\\3_\\4_%H%M%S.grib WAVE 253 Grid - Regional Eastern North Pacific Hurricane (WAVE-ENP-Hurr) \uf0c1 HDS ^O.Q.88 KWBJ.*!grib/ncep/[^/]*/#[^/]*/(............)/F(...)/(.*)/(.*)/ FILE -edex -log /awips2/data_store/grid/WAVE-HDS-HurWave253/HurWave253\\1_F\\2_\\3_\\4_%H%M%S.grib WW3 Global \uf0c1 NGRID E.A.88 KWBJ.*ncep/[^/]*/#[^/]*/(............)F(...)/.* FILE -edex -log /awips2/data_store/grid/WW3_Global/WW3_Global_\\1_\\200.grib2 WW3 Regional Alaska \uf0c1 NGRID E.E.88 KWBJ.*ncep/[^/]*/#[^/]*/(............)F(...)/.* FILE -edex -log /awips2/data_store/grid/WW3_Regional_Alaska/WW3_Regional_Alaska_\\1_\\200.grib2 WW3 Coastal Alaska \uf0c1 NGRID E.F.88 KWBJ.*ncep/[^/]*/#[^/]*/(............)F(...)/.* FILE -edex -log /awips2/data_store/grid/WW3_Coastal_Alaska/WW3_Coastal_Alaska_\\1_\\200.grib2 WW3 Eastern Pacific (Regional) \uf0c1 NGRID E.D.88 KWBJ.*ncep/[^/]*/#[^/]*/(............)F(...)/.* FILE -edex -log /awips2/data_store/grid/WW3_Regional_Eastern_Pacific/WW3_Regional_Eastern_Pacific_\\1_\\200.grib2 WW3 US East Coast (Regional) \uf0c1 NGRID E.B.88 KWBJ.*ncep/[^/]*/#[^/]*/(............)F(...)/.* FILE -edex -log 
/awips2/data_store/grid/WW3_Regional_US_East_Coast/WW3_Regional_US_East_Coast_\\1_\\200.grib2 WW3 US East Coast (Coastal) \uf0c1 NGRID E.H.88 KWBJ.*ncep/[^/]*/#[^/]*/(............)F(...)/.* FILE -edex -log /awips2/data_store/grid/WW3_Coastal_US_East_Coast/WW3_Coastal_US_East_Coast_\\1_\\200.grib2 WW3 US West Coast (Regional) \uf0c1 NGRID E.C.88 KWBJ.*ncep/[^/]*/#[^/]*/(............)F(...)/.* FILE -edex -log /awips2/data_store/grid/WW3_Regional_US_West_Coast/WW3_Regional_US_West_Coast_\\1_\\200.grib2 WW3 US West Coast (Coastal) \uf0c1 NGRID E.G.88 KWBJ.*ncep/[^/]*/#[^/]*/(............)F(...)/.* FILE -edex -log /awips2/data_store/grid/WW3_Coastal_US_West_Coast/WW3_Coastal_US_West_Coast_\\1_\\200.grib2 ESTOFS - US \uf0c1 NGRID ^E[EHC]I[A-Z]88 KWBM ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/ESTOFS/ESTOFS_\\1_\\2_\\3_\\4-(seq).grib ESTOFS - Puerto Rico \uf0c1 NGRID ^E[EHC]P[A-Z]88 KWBM ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/ESTOFS-PR/ESTOFS-PR_\\1_\\2_\\3_\\4-(seq).grib ESTOFS Pacific - Alaska \uf0c1 NGRID ^E[EHC]A[A-Z]88 KWBM ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/ESTOFS-AK/ESTOFS-AK_\\1_\\2_\\3_\\4-(seq).grib ESTOFS Pacific - CONUS (West Coast) \uf0c1 NGRID ^E[EHC]D[A-Z]88 KWBM ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/ESTOFS-WC/ESTOFS-WC_\\1_\\2_\\3_\\4-(seq).grib ESTOFS Pacific - Hawaii \uf0c1 NGRID ^E[EHC]H[A-Z]88 KWBM ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/ESTOFS-HI/ESTOFS-HI_\\1_\\2_\\3_\\4-(seq).grib Extra-Tropical Storm Surge (ETSS) \uf0c1 NGRID ^MHU... KNHC (..)(..)(..) FILE -edex -log /awips2/data_store/grid/ETSS/ETSS_\\1\\2\\3-(seq).grib2 GLERL \uf0c1 HDS ^O.N.88 KWNB.*!grib/161/([^/]*)/#([^/]*)/(............)/F(...)/.* FILE -edex -log /awips2/data_store/grid/GLERL/GLERL_\\1_F\\2_%H%M%S.grib Important Files and Directories \uf0c1 |---|---| | location on disk | /awips2/edex/data/hdf5/grid | | definition files | /awips2/edex/data/utility/edex_static/base/grib/models | | navigation files | /awips2/edex/data/utility/edex_static/base/grib/grids | | grib1 definitions | /awips2/edex/data/utility/common_static/base/grid | | D2D files | /awips2/edex/data/utility/edex_static/base/grib/grids | | metadata tables | grid | | | grid_info | | | gridcoverage |","title":"Available IDD Grids"},{"location":"edex/data-grids/#available-idd-grids","text":"The file /awips2/ldm/etc/pqact.conf defines which grids the LDM will request for EDEX ingest. After editing this file (as user awips ) you should run ldmadmin pqactHUP to re-read the new edits ( ldmadmin restart will also work).","title":"Available IDD Grids"},{"location":"edex/data-grids/#gfs","text":"","title":"GFS"},{"location":"edex/data-grids/#gfs-global-025-degree","text":"CONDUIT ^data/nccf/com/.*gfs.t[0-9][0-9]z.pgrb2.0p25.*!grib2/[^/]*.*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.*! (......) FILE -edex -log /awips2/data_store/grid/GFS0p25/GFS_Global_0p25deg_\\1\\2\\3_\\4_\\5-(seq).grib2","title":"GFS Global 0.25 degree"},{"location":"edex/data-grids/#gfs-global-10-degree","text":"CONDUIT ^data/nccf/com/.*gfs.t[0-9][0-9]z.pgrb2.1p00.*!grib2/[^/]*/.*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.*! (......) 
FILE -edex -log /awips2/data_store/grid/GFS1p0/GFS_Global_onedeg_\\1\\2\\3_\\4_\\5-(seq).grib2","title":"GFS Global 1.0 degree"},{"location":"edex/data-grids/#gfs-global-25-degree","text":"CONDUIT ^data/nccf/com/.*gfs.t[0-9][0-9]z.pgrb2.2p50.*!grib2/[^/]*/.*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.*! (......) FILE -edex -log /awips2/data_store/grid/GFS2p5/GFS_Global_2p50deg_\\1\\2\\3_\\4_\\5-(seq).grib2","title":"GFS Global 2.5 degree"},{"location":"edex/data-grids/#gfs-global-10-degree-noaaport","text":"NGRID ^[YZ].P... KWBC ...... !grib2/ncep/GFS.*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/GFS1p0_noaaport/GFS_Global_onedeg_noaaport_\\1_\\2_\\3_\\4.grib2","title":"GFS Global 1.0 degree (NOAAPORT)"},{"location":"edex/data-grids/#gfs-pacific-40-km","text":"NGRID ^[LM].O... KWBC ...... !grib2.*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/GFSPacific-40km/GFSPacific-40km_\\1_\\2_\\3_\\4-(seq).grib2","title":"GFS Pacific 40 km"},{"location":"edex/data-grids/#gfs-pacific-20-km-mercator","text":"NGRID ^[YZ].F... KWBC ...... !grib2.*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/GFSPacific-20km/GFSPacific-20km_\\1_\\2_\\3_\\4-(seq).grib2","title":"GFS Pacific 20 km Mercator"},{"location":"edex/data-grids/#gfs-puerto-rico-05-degree","text":"NGRID ^[LM].T... KWBC ...... !grib2.*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/PR-GFS0p5/PR-GFS0p5_\\1_\\2_\\3_\\4-(seq).grib2","title":"GFS Puerto Rico 0.5 degree"},{"location":"edex/data-grids/#gfs-puerto-rico-20-km-latlon-025-degree","text":"NGRID ^[YZ].E... KWBC ...... !grib2.*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/PR-GFS0p25/PR-GFS0p25_\\1_\\2_\\3_\\4-(seq).grib2","title":"GFS Puerto Rico 20 km Lat/Lon 0.25 degree"},{"location":"edex/data-grids/#gfs-conus-80-km","text":"HDS ^[YZ].Q... KWBC ...... !grib.*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/GFS80/GFS80_\\1_\\2_\\3_\\4-(seq).grib2","title":"GFS CONUS 80 km"},{"location":"edex/data-grids/#gfs-alaska-95-km-gfs95","text":"NGRID ^[LM].H... KWBC ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/AK-GFS95/AK-GFS95\\1_\\2Z_\\3_\\4-(seq).grib2","title":"GFS Alaska 95 km (GFS95)"},{"location":"edex/data-grids/#gfs-conus-20-km-gfs20","text":"NGRID ^[YZ].N... KWBC ...... !grib.*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/GFS20/GFS20_\\1_\\2_\\3_\\4-(seq).grib2","title":"GFS CONUS 20 km (GFS20)"},{"location":"edex/data-grids/#gfs-alaska-20-km-ak-gfs22","text":"NGRID ^[YZ].B... KWBC ...... !grib.*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/AK-GFS20/AK-GFS20_\\1_\\2_\\3_\\4-(seq).grib2","title":"GFS Alaska 20 km (AK-GFS22)"},{"location":"edex/data-grids/#gfsguide","text":"NGRID ^[LM].I... KWBJ ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/GFSGuide/GFSGuide_\\1_\\3_\\4Z_\\5_\\6-(seq).grib2","title":"GFSGuide"},{"location":"edex/data-grids/#rtma-and-urma","text":"","title":"RTMA and URMA"},{"location":"edex/data-grids/#rtma-197-5km","text":"NGRID ^[LM].M... KWBR ...... 
!grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/RTMA5/RTMA_5km_\\1_\\3_\\4Z_\\5_\\6-(seq).grib2","title":"RTMA 197 (5km)"},{"location":"edex/data-grids/#rtma-mosaic-25km-i","text":"NGRID ^[LM].I... KWBR ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/RTMA/RTMA_2p5km_\\1_\\3_\\4Z_\\5_\\6-(seq).grib2","title":"RTMA-Mosaic 2.5km (I)"},{"location":"edex/data-grids/#urma25-q","text":"NGRID ^[LM].Q... KWBR ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/URMA25/URMA_2p5km_\\1_\\3_\\4Z_\\5_\\6-(seq).grib2","title":"URMA2.5 (Q)"},{"location":"edex/data-grids/#nam","text":"","title":"NAM"},{"location":"edex/data-grids/#nam-conus-12-km-nam12-noaaport","text":"NGRID ^[LM].B... KWBE ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/NAM12/noaaport/NAM_CONUS_12km_noaaport_\\1_\\3_\\4Z_\\5_\\6-(seq).grib2","title":"NAM CONUS 12 km (NAM12) - NOAAport"},{"location":"edex/data-grids/#nam-alaska-11-km-ak-nam11","text":"NGRID ^[LM].S... KWBE ...... !grib.*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/AK-NAM11/NAM_Alaska_11km_\\1_\\2_\\3_\\4-(seq).grib2","title":"NAM Alaska 11 km (AK-NAM11)"},{"location":"edex/data-grids/#nam-alaska-45-km-grid216-conduit-ak-nam45","text":"CONDUIT ^data/nccf/com/nam/prod/nam.*t(..)z.awipak.* !grib2/ncep/NAM_84/#000/(............)(F...)/(.*)/.* FILE -edex -log /awips2/data_store/grid/AK-NAM45/conduit/NAM_Alaska_45km_conduit_\\2_\\3_\\4_\\5-(seq).grib2","title":"NAM Alaska 45 km GRID216 - CONDUIT (AK-NAM45)"},{"location":"edex/data-grids/#nam-conus-12-km-nam12-conduit","text":"CONDUIT ^data/nccf/com/nam/.*nam.*awip12.*!grib2/ncep/NAM_84/#[^/]*/([0-9]{8})([0-9]{4})(F[0-1]..)/([^/]*)/.*! (......) FILE -edex -log /awips2/data_store/grid/NAM12/conduit/NAM_CONUS_12km_conduit_\\1_\\2Z_\\3_\\4-(seq).\\5.grib2","title":"NAM CONUS 12 km (NAM12) - CONDUIT"},{"location":"edex/data-grids/#nam-conus-40-km-nam40-conduit","text":"CONDUIT ^data/nccf/com/nam/.*awip3d.*!grib2/ncep/NAM_84/#[^/]*/([0-9]{8})([0-9]{4})(F[0-1]..)/([^/]*)/.*! (......) FILE -edex -log /awips2/data_store/grid/NAM40/conduit/NAM_CONUS_40km_conduit_\\1_\\2Z_\\3_\\4-(seq).\\5.grib2","title":"NAM CONUS 40 km (NAM40) - CONDUIT"},{"location":"edex/data-grids/#nam-conus-40-km-nam40-noaaport","text":"HDS ^[YZ].[A-WYZ].{1,3} KWBD ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/NAM40/noaaport/NAM_CONUS_40km_noaaport_\\1\\2_\\3_\\4-(seq).grib","title":"NAM CONUS 40 km (NAM40) - NOAAport"},{"location":"edex/data-grids/#nam-alaska-95-km-ak-nam95","text":"HDS ^[YZ].N... KWBE .......*/m(ETA|NAM) !grib.*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/AK-NAM95/noaaport/NAM_Alaska_95km_\\2_\\3_\\4_\\5-(seq).grib2","title":"NAM Alaska 95 km (AK-NAM95)"},{"location":"edex/data-grids/#nam-conus-80-km-nam80","text":"HDS ^[YZ].Q... KWB. [0-3][0-9][0-2][0-9].*/m(ETA|NAM) !grib.*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/NAM80/NAM_CONUS_80km_\\1_\\2Z_\\3_\\4-(seq).grib2","title":"NAM CONUS 80 km (NAM80)"},{"location":"edex/data-grids/#nam-conus-20-km-nam20-removed","text":"HDS ^[YZ].U... KWB. 
[0-3][0-9][0-2][0-9].*/m(ETA|NAM) !grib.*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/NAM20/NAM_CONUS_20km_\\2_\\3Z_\\4_\\5-(seq).\\1.grib2","title":"NAM CONUS 20 km (NAM20) (removed??)"},{"location":"edex/data-grids/#nam-alaska-45-km-grid216-noaaport-ak-nam45","text":"HDS ^[YZ].V... KWB. .......*/m(ETA|NAM) !grib.*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/AK-NAM45/noaaport/NAM_Alaska_45km_noaaport_\\2_\\3_\\4_\\5-(seq).grib2","title":"NAM Alaska 45 km GRID216 - NOAAport (AK-NAM45)"},{"location":"edex/data-grids/#nam-alaska-22-km-ak-nam22","text":"HDS ^[YZ].Y... KWBE .......*/m(ETA|NAM) !grib.*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/AK-NAM22/NAM_Alaska_22km_\\2_\\3_\\4_\\5-(seq).grib2","title":"NAM Alaska 22 km (AK-NAM22)"},{"location":"edex/data-grids/#nam-puerto-rico-grid-237-pr-nam","text":"HDS ^[YZ].Z.{1,3} KWBE .......*/#([^/]*)/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/PR-NAM12/GRID\\8/Regional_NAM_GRID\\8_\\2_\\3_\\4_\\5-(seq).grib","title":"NAM Puerto Rico Grid 237 (PR-NAM)"},{"location":"edex/data-grids/#nam-polar-90-km","text":"CONDUIT ^data/nccf/com/nam/prod/nam........./nam.t..z.grbgrd.*NAM_84/#[^/]*/([0-9]{8})([0-9]{4})(F[0-1]..)/([^/]*)/.*! (......) FILE -edex -log /awips2/data_store/grid/NAM90/NAM_Polar_90km_\\1_\\2Z_\\3_\\4.\\5.grib2","title":"NAM Polar 90 km"},{"location":"edex/data-grids/#nam-fire-weather-nest","text":"CONDUIT ^data/nccf/com/nam/prod/nam........./nam.t..z.firewxnest.*NMM_89/#[^/]*/([0-9]{8})([0-9]{4})(F[0-1]..)/([^/]*)/.*! (......) FILE -edex -log /awips2/data_store/grid/NAMFirewxnest/NAM_Firewxnest_\\1_\\2Z_\\3_\\4.\\5.grib2","title":"NAM Fire Weather Nest"},{"location":"edex/data-grids/#namdng-25-km-ngrid-namdng","text":"NGRID ^[LM].I... KWBE ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/NamDNG/ngrid/NamDNG_2p5km_\\1_\\3_\\4Z_\\5_\\6-(seq).grib2","title":"NamDNG 2.5 km NGRID (NamDNG)"},{"location":"edex/data-grids/#namdng-5-km-namdng5","text":"NGRID ^[LM].M... KWBE ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/NamDNG5/NamDNG_5km_\\1_\\3_\\4Z_\\5_\\6-(seq).grib2","title":"NamDNG 5 km (NamDNG5)"},{"location":"edex/data-grids/#ak-namdng5","text":"HDS ^[LM].A.{1,3} KWB. .......*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/AK-NamDNG5/AK-NamDNG5_\\1_\\2\\3\\4-(seq).grib2","title":"AK-NamDNG5"},{"location":"edex/data-grids/#pr-namdng5","text":"HDS ^[LM].C.{1,3} KWB. .......*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/PR-NamDNG5/PR-NamDNG5_\\1_\\2\\3\\4-(seq).grib2","title":"PR-NamDNG5"},{"location":"edex/data-grids/#hawaii-namdng5","text":"HDS ^[LM].H.{1,3} KWB. .......*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/Hawaii-NamDNG5/Hawaii-NamDNG5_\\1_\\2\\3\\4-(seq).grib2","title":"Hawaii-NamDNG5"},{"location":"edex/data-grids/#ak-namdng-3km","text":"HDS ^[LM].K.{1,3} KWB. .......*/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/AK-NamDNG-3km/AK-NamDNG-3km_\\1_\\2\\3\\4.grib2","title":"AK NamDNG 3km"},{"location":"edex/data-grids/#rap","text":"","title":"RAP"},{"location":"edex/data-grids/#rap-conus-13-km-rap13","text":"NGRID ^[LM].D... KWBG ...... 
!grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/RAP13/RR_CONUS_13km_\\3_\\4Z_\\5_\\6-(seq).grib2","title":"RAP CONUS 13 km (RAP13)"},{"location":"edex/data-grids/#rap-conus-20-km-rap20","text":"CONDUIT ^data/nccf/com/rap.*awp252.*!grib2/ncep/RUC2/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*)/.*! (......) FILE -edex -log /awips2/data_store/grid/RAP20/RR_CONUS_20km_\\1_\\2Z_\\3_\\4-(seq).\\5.grib2","title":"RAP CONUS 20 km (RAP20)"},{"location":"edex/data-grids/#rap-conus-40-km-rap40-noaaport","text":"HDS ^[YZ].W.{1,3} KWBG ......[^!]*!grib/ncep/RUC2/#236/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/RAP40/noaaport/RR_CONUS_40km_noaaport_\\1_\\2Z_\\3_\\4-(seq).grib","title":"RAP CONUS 40 km (RAP40) - NOAAport"},{"location":"edex/data-grids/#rap-conus-40-km-rap40-condiut","text":"CONDUIT ^data/nccf/com/rap.*awp236.*!grib2/ncep/RUC2/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*)/.*! (......) FILE -edex -log /awips2/data_store/grid/RAP40/conduit/RR_CONUS_40km_conduit_\\1_\\2Z_\\3_\\4-(seq).\\5.grib2","title":"RAP CONUS 40 km (RAP40) - CONDIUT"},{"location":"edex/data-grids/#hrrr","text":"","title":"HRRR"},{"location":"edex/data-grids/#hrrr-noaaport-1hr","text":"NGRID Y.C.[0-9][0-9] KWBY ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{12})F(...)/(.*)/.* FILE -edex -log /awips2/data_store/grid/HRRR/HRRR_CONUS_2p5km_\\1_F\\2_\\3-(seq).grib2","title":"HRRR - NOAAport 1hr"},{"location":"edex/data-grids/#hrrrx-experimental-gsd-hourly-and-sub-hourly","text":"FSL2 ^GRIB2\\.FSL\\.HRRR\\.(.......)_Lambert\\.(.*)(Minute|Hour)\\.(.*)\\.(.*)\\.([0-9]{12}).* FILE -edex -log /awips2/data_store/grid/HRRRX/HRRRX_CONUS_3km_\\6-\\1_\\3\\2_\\4_\\5-(seq).grib2","title":"HRRRX - Experimental GSD Hourly and Sub-Hourly"},{"location":"edex/data-grids/#shef","text":"","title":"SHEF"},{"location":"edex/data-grids/#sref-conus-40-km-ensemble-derived-products","text":"NGRID ^[LM].R... KWBL ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{12})F(...)/(.*)/.* FILE -edex -log /awips2/data_store/grid/SREF212/noaaport/SREF_CONUS_40km_ensprod_(\\1:yyyy)(\\1:mm)\\1_\\2.grib2","title":"SREF CONUS 40 km Ensemble Derived Products"},{"location":"edex/data-grids/#sref-conus-40-km-bias-corrected-ensemble-derived-products","text":"CONDUIT ^data/nccf/com/sref/prod/sref\\........./../(ensprod_biasc)/.*pgrb212.*!grib2/ncep/.*/#000/(............)(F...)/(.*)/.*! (......) FILE -edex -log /awips2/data_store/grid/SREF212/conduit/SREF_CONUS_40km_\\1_\\2_\\3_\\4_\\5-(seq).grib2","title":"SREF CONUS 40 km Bias Corrected Ensemble Derived Products"},{"location":"edex/data-grids/#sref-conus-40-km-bias-corrected-ensemble-members","text":"CONDUIT ^data/nccf/com/sref/prod/sref\\.(........)/(..)/(pgrb_biasc)/.*pgrb212.*!grib2/ncep/.*/#000/............(F...)/(.*)/.*! (......) FILE -edex -log /awips2/data_store/grid/SREF212/conduit/SREF_CONUS_40km_\\3_\\1_\\200_\\4_\\5_\\6.grib2","title":"SREF CONUS 40 km Bias Corrected Ensemble Members"},{"location":"edex/data-grids/#sref-alaska-45-km-ensemble-derived-products","text":"NGRID ^[LM].V... KWBL .......*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/AK-SREF45/SREF_Alaska_45km_ensprod_\\1\\2_\\3_\\4-(seq).grib2","title":"SREF Alaska 45 km Ensemble Derived Products"},{"location":"edex/data-grids/#sref-pacific-northeast-04-degree-ensemble-derived-products","text":"NGRID ^[LM].X... 
KWBL .......*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/EPac-SREF/SREF_PacificNE_0p4_ensprod_\\1\\2_\\3_\\4-(seq).grib2","title":"SREF Pacific Northeast 0.4 degree Ensemble Derived Products"},{"location":"edex/data-grids/#fnmoc","text":"","title":"FNMOC"},{"location":"edex/data-grids/#navy-coupled-ocean-data-assimilation-ncoda-global-025-degree","text":"FNMOC ^US058.*0078_0200_(.*)_(.*)_(.*)-(.*) FILE -edex -log /awips2/data_store/grid/FNMOC-NCODA/fnmoc_NCODA_Global_Ocean_\\1_\\2_\\3_\\4.grib","title":"Navy Coupled Ocean Data Assimilation (NCODA) - Global 0.25 degree"},{"location":"edex/data-grids/#navy-global-environmental-model-navgem-05-degree","text":"FNMOC ^US058.*0018_0056_(.*)_(.*)_(.*)-(.*) FILE -edex -log /awips2/data_store/grid/FNMOC-NAVGEM/fnmoc_NAVGEM_Global_0p5deg_\\1_\\2_\\3_\\4.grib","title":"NAVy Global Environmental Model (NAVGEM) - 0.5 degree"},{"location":"edex/data-grids/#forecast-of-aerosol-radiative-optical-properties-farop-global-10-degree","text":"FNMOC ^US058.*0135_0240_(.*)_(.*)_(.*)-(.*) FILE -edex -log /awips2/data_store/grid/FNMOC-FAROP/fnmoc_FAROP_Global_1p0deg_\\1_\\2_\\3_\\4.grib","title":"Forecast of Aerosol Radiative Optical Properties (FAROP) - Global 1.0 degree"},{"location":"edex/data-grids/#wavewatch-iii-ww3-global-10-degree","text":"FNMOC ^US058.*0110_0240_(.*)_(.*)_(.*)-(.*) FILE -edex -log /awips2/data_store/grid/FNMOC-WW3-Global_1p0deg/fnmoc_WW3_Global_1p0deg_\\1_\\2_\\3_\\4.grib","title":"WAVEWATCH III (WW3) - Global 1.0 degree"},{"location":"edex/data-grids/#ww3-europe","text":"FNMOC ^US058.{4}-GR1dyn\\.WW3-EURO_EURO-..-.._.{5}.{4}(.{4})(..)(..)(..)(.*) FILE -edex -log /awips2/data_store/grid/FNMOC-WW3-Europe/fnmoc_WW3_Europe_\\1\\2\\3_\\400_\\5.grib","title":"WW3 Europe"},{"location":"edex/data-grids/#coupled-oceanatmospheric-mesoscale-prediction-system-coamps-western-atlantic","text":"FNMOC ^US058.{4}-GR1dyn\\.COAMPS-NWATL_.{5}-.{2}-.{2}_(.{9})(..........)(.*) FILE -edex -log /awips2/data_store/grid/FNMOC-COAMPS-Western_Atlantic/fnmoc_COAMPS_Western_Atlantic_\\1\\2\\3-(seq).grib","title":"Coupled Ocean/Atmospheric Mesoscale Prediction System (COAMPS) - Western Atlantic"},{"location":"edex/data-grids/#coamps-europe","text":"FNMOC ^US058.{4}-GR1dyn\\.COAMPS-EURO_.{4}-.{2}-.{2}_(.{9})(..........)(.*) FILE -edex -log /awips2/data_store/grid/FNMOC-COAMPS-Europe/fnmoc_COAMPS_Europe_\\1\\2\\3-(seq).grib","title":"COAMPS Europe"},{"location":"edex/data-grids/#coamps-equatorial-america","text":"FNMOC ^US058.{4}-GR1dyn\\.COAMPS-EQAM_.{4}-.{2}-.{2}_(.{9})(..........)(.*) FILE -edex -log /awips2/data_store/grid/FNMOC-COAMPS-Equatorial_America/fnmoc_COAMPS_Equatorial_America_\\1\\2\\3-(seq).grib","title":"COAMPS Equatorial America"},{"location":"edex/data-grids/#coamps-northeast-pacific","text":"FNMOC ^US058.{4}-GR1dyn\\.COAMPS-NEPAC_.{5}-.{2}-.{2}_(.{9})(..........)(.*) FILE -edex -log /awips2/data_store/grid/FNMOC-COAMPS-Northeast_Pacific/fnmoc_COAMPS_Northeast_Pacific_\\1\\2\\3-(seq).grib","title":"COAMPS Northeast Pacific"},{"location":"edex/data-grids/#coamps-southern-california","text":"FNMOC ^US058.{4}-GR1dyn\\.COAMPS-SOCAL_.{5}-.{2}-.{2}_(.{9})(..........)(.*) FILE -edex -log /awips2/data_store/grid/FNMOC-COAMPS-Southern_California/fnmoc_COAMPS_Southern_California_\\1\\2\\3-(seq).grib","title":"COAMPS Southern California"},{"location":"edex/data-grids/#multi-radar-multi-sensor-mrms-noaaport","text":"","title":"Multi-Radar Multi-Sensor (MRMS) - 
NOAAport"},{"location":"edex/data-grids/#full-feed","text":"NGRID ^YAU[CDLMPQS].. KWNR ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.* FILE -edex -log /awips2/data_store/grid/MRMS/MRMS_\\1_\\2_\\3_\\4.grib2","title":"Full Feed"},{"location":"edex/data-grids/#mrms-precipitation-products","text":"NGRID ^YAU[DP].. KWNR ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.* FILE -edex -log /awips2/data_store/grid/MRMS-precip/MRMS_\\1_\\2_\\3_\\4.grib2","title":"MRMS Precipitation Products"},{"location":"edex/data-grids/#mrms-model-parameters-on-different-grid","text":"NGRID ^YAUM.. KWNR ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.* FILE -edex -log /awips2/data_store/grid/MRMS-model/MRMS_\\1_\\2_\\3_\\4.grib2","title":"MRMS Model Parameters (on different grid)"},{"location":"edex/data-grids/#mrms-lightning-products-from-nldn","text":"NGRID ^YAUL.. KWNR ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.* FILE -edex -log /awips2/data_store/grid/MRMS-lightning/MRMS_\\1_\\2_\\3_\\4.grib2","title":"MRMS Lightning Products from NLDN"},{"location":"edex/data-grids/#mrms-rotation-track-products-on-different-grid","text":"NGRID ^YAUS0[0-4] KWNR ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.* FILE -edex -log /awips2/data_store/grid/MRMS-rotation/MRMS_\\1_\\2_\\3_\\4.grib2","title":"MRMS Rotation Track Products (on different grid)"},{"location":"edex/data-grids/#mrms-mid-level-rotation-track-products-on-different-grid","text":"NGRID ^YAUS0[5-9] KWNR ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.* FILE -edex -log /awips2/data_store/grid/MRMS-rotation-ml/MRMS_\\1_\\2_\\3_\\4.grib2","title":"MRMS Mid-level Rotation Track Products (on different grid)"},{"location":"edex/data-grids/#mrms-merged-base-reflectivity","text":"NGRID ^YAUQ.. KWNR ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.* FILE -edex -log /awips2/data_store/grid/MRMS-merged/MRMS_\\1_\\2_\\3_\\4.grib2","title":"MRMS Merged Base Reflectivity"},{"location":"edex/data-grids/#mrms-radar-products","text":"NGRID ^YAU(C[0-9]|S[1-9])[0-9] KWNR ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.* FILE -edex -log /awips2/data_store/grid/MRMS-radar/MRMS_\\2_\\3_\\4_\\5.grib2","title":"MRMS Radar Products"},{"location":"edex/data-grids/#mrms-anything-else-mainly-future-proofing","text":"NGRID ^YAU[ABE-KNORT-Z][0-9][0-9] KWNR ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.* FILE -edex -log /awips2/data_store/grid/MRMS-other/MRMS_\\1_\\2_\\3_\\4.grib2","title":"MRMS Anything else (mainly future proofing)"},{"location":"edex/data-grids/#ecmwf","text":"","title":"ECMWF"},{"location":"edex/data-grids/#ecmf-global-ecmf1ecmf12","text":"HDS ^H..... ECM. ......[^!]*!grib.*/[^/]*/[^/]*/#([^/]*)/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/ECMWF/ECMWF-\\1_\\2_\\3_\\4_\\5-(seq).grib","title":"ECMF-Global, ECMF1..ECMF12"},{"location":"edex/data-grids/#other","text":"","title":"Other"},{"location":"edex/data-grids/#canadian-gem-regional-model-cmc","text":"CMC ^CMC_reg_(.*)km_(..........)_P(...).grib2 FILE -edex -log /awips2/data_store/grid/CMC/CMC_reg_\\1km_\\2_P\\3.grib2","title":"Canadian GEM Regional Model (CMC)"},{"location":"edex/data-grids/#ukmet-global","text":"HDS ^H..... 
EGRR ......[^!]*!grib/ukmet/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/UKMET-\\1-GRID\\2/UKMET-\\1-GRID\\2_\\3\\4_\\5_\\6-(seq).grib1","title":"UKMET Global"},{"location":"edex/data-grids/#national-precipitation-verification-unit-npvu-rfc-multisensor-precipitation-estimates-mpe-mpe-local-mpe-mosaic-","text":"HDS ^ZETA98 (....) ([0-3][0-9])([0-2][0-9]).*/m(.......) FILE -edex -log /awips2/data_store/grid/MPE-\\1/MPE-\\1_\\4_(\\2:yyyy)(\\2:mm)\\2.grib","title":"National Precipitation Verification Unit (NPVU) - RFC Multisensor Precipitation Estimates (MPE) (MPE-Local-..., MPE-Mosaic-...)"},{"location":"edex/data-grids/#automated-satellite-precipitation-estimates-from-nesdis-hourly-autospe","text":"HDS ^ZETA98 K[NW][EN][ES] ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})/([^/]*)/([^/]*) FILE -edex -log /awips2/data_store/grid/AUTOSPE/AUTOSPE-\\1\\2_\\3_\\4-(seq).grib2","title":"Automated Satellite Precipitation Estimates from NESDIS (hourly) (AUTOSPE)"},{"location":"edex/data-grids/#river-forecast-center-rfc-quantitative-precipitation-estimation-qpe","text":"HDS ^ZETA98 (KTUA|PACR|KSTR|KRSA|KORN|KRHA|KKRF|KMSR|KTAR|KPTR|KTIR|KALR|KFWR) ......[^!]*!(grib|grib2)/[^/]*/([^/]*)/#255/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/QPE-RFC/QPE-RFC-\\1_\\3_\\4_\\5_\\6_\\7-(seq).grib","title":"River Forecast Center (RFC) Quantitative Precipitation Estimation (QPE)"},{"location":"edex/data-grids/#ocean-sea-surface-temperature-sst-grids-61-64","text":"HDS ^H.[T-W] FILE -edex -log /awips2/data_store/grid/SST/%Y%m%d%H%M.sst.grib","title":"Ocean Sea Surface Temperature (SST) Grids #61-64"},{"location":"edex/data-grids/#hpcguide","text":"NGRID ^([LM][ABCDFGH]U...) (KWBN) (..)(..)(..)[^!]*!(grib|grib2)/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/HPCGuide/GRID\\8/HPCGuide_GRID\\8_\\(10)Z_\\(11)_\\(12)-\\1_\\2_\\3\\4\\5-(seq).grib2","title":"HPCGuide"},{"location":"edex/data-grids/#national-convective-weather-forecast-ncwf","text":"ANY ^ZDIA98 (....) 
......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})/(F[0-9]{3}) FILE -edex -log /awips2/data_store/grid/NCWF/NCWF_\\2_\\3_\\4_\\1-(seq).grib","title":"National Convective Weather Forecast (NCWF)"},{"location":"edex/data-grids/#national-operational-hydrologic-remote-sensing-center-snow-analysis-nohrsc-snow","text":"HDS ^[YZ][ES]QA88 KMSR ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/NOHRSC-SNOW/NOHRSC-SNOW_\\1_\\2_\\3_\\4-(seq).grib","title":"National Operational Hydrologic Remote Sensing Center Snow Analysis (NOHRSC-SNOW)"},{"location":"edex/data-grids/#gfs-mos-based-localized-aviation-mos-program-lamp-guidance-lamp2p5-gfslamp5","text":"NGRID ^([LM].[ABDHMNRSTU].{1,3}) (KWNO|KMDL) (..)(..)(..)[^!]*!(grib2)/[^/]*/(LAMP)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/LAMP/\\7_\\9_\\(10)Z_\\(11)_\\(12)-\\1_\\2-(seq).grib2","title":"GFS MOS-Based Localized Aviation MOS Program (LAMP) guidance - LAMP2p5, GFSLAMP5"},{"location":"edex/data-grids/#radar-coded-messages-rcm","text":"HDS ^HAXA00 KWBC ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/RCM/RCM_\\1_\\2_\\3_\\4-(seq).grib","title":"Radar Coded Messages (RCM)"},{"location":"edex/data-grids/#national-digital-forecast-database-ndfd","text":"CONDUIT grib2/nwstg/NWS_0/..../(........)(....)(F...)/(.*)/.*! (......) FILE -edex -log /awips2/data_store/grid/NDFD/NDFD_\\1\\2\\3_\\4-\\5.grib2","title":"National Digital Forecast Database (NDFD)"},{"location":"edex/data-grids/#ndfd-wpc-quantitative-precipitation-forecast-hpcqpfndfd","text":"NGRID ^[LM].[MN].98 KWNH ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/HPCqpf-ngrid/HPCqpf_\\1_\\2_\\3_\\4-(seq).grib2","title":"NDFD WPC Quantitative Precipitation Forecast (HPCqpfNDFD)"},{"location":"edex/data-grids/#ndfd-wpc-day-1-3-excessive-rainfall-outlook-hpcqpfndfd","text":"HDS ^[LM].[MN].98 KWNH ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/HPCqpf-hds/HPCqpf_\\1_\\2_\\3_\\4-(seq).grib","title":"NDFD WPC Day 1-3 Excessive Rainfall Outlook (HPCqpfNDFD)"},{"location":"edex/data-grids/#mosguidemosguideextendedgmos","text":"NGRID ^(Y.UZ9[0-9]) (KWB.) (..)(..)(..) FILE -edex -log /awips2/data_store/grid/MOSGuide/MOSGuide_\\1_\\2_\\3\\4\\5-(seq).grib2","title":"MOSGuide/MOSGuideExtended/GMOS"},{"location":"edex/data-grids/#flash-flood-guidance-ffg-grids-1hrhpbl-3hr5wavh-6hrcnwat-ffg-ptr","text":"HDS ^ZEGZ98 K.{3} .......*/#([^/]*)/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/FFG/FFG-\\1_\\2_\\3_\\4_\\5-(seq).grib","title":"Flash Flood Guidance (FFG) grids - 1HR=HPBL, 3HR=5WAVH, 6HR=CNWAT (FFG-PTR...)"},{"location":"edex/data-grids/#prob3hr236","text":"HDS ^Z[DE]W[A-D][89]8 KWNO ...... 
/m0 !grib.*/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/PROB3HR/PROB3HR_\\1_\\2_\\3_\\4-(seq).grib","title":"PROB3HR/#236"},{"location":"edex/data-grids/#national-data-buoy-center-ndbc-high-frequency-radar-hfr-total-vector-velocity-tvv","text":"NGRID ^OUTA98 KWNB (......)[^!]*!grib2 FILE -edex -log /awips2/data_store/grid/HFR/HFR_\\1-(seq).grib2","title":"National Data Buoy Center (NDBC) High Frequency Radar (HFR) Total Vector Velocity (TVV)"},{"location":"edex/data-grids/#regional-river-forecast-cebter-rfc-quantitative-precipitation-forecast-qpf-rfcqpf","text":"HDS ^YEI.[89]8 (KALR|KFWR|KKRF|KMSR|KORN|KPTR|KRHA|KRSA|KSTR|KTAR|KTIR|KTUA) .......*/#([^/]*)/([0-9]{8})([0-9]{4})/(F[0-9]{3})/[^/]*/([^/]*) FILE -edex -log /awips2/data_store/grid/Regional_RFC_QPF/GRID\\2/\\3_\\4_\\5_\\6-(seq).grib","title":"Regional River Forecast Center (RFC) Quantitative Precipitation Forecast (QPF) (RFCqpf)"},{"location":"edex/data-grids/#grid218-hpcqpf","text":"HDS ^(ZEX.98) KWNH (..)(..)(..)[^!]*!(grib|grib2)/[^/]*/([^/]*)/#218/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/HPCqpf/HPCqpf_\\9Z_\\(10)_\\(11)-\\1_KWNH_\\2\\3\\4-(seq).grib","title":"GRID218 = HPCqpf"},{"location":"edex/data-grids/#regional-river-forecast-cebter-rfc-quantitative-precipitation-forecast-qpf","text":"HDS ^(YEI.[89]8) KWNH (..)(..)(..)[^!]*!(grib|grib2)/ncep/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})/(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/RFC_QPF_GRID\\7/\\9Z_\\(10)_\\(11)-\\1_KWNH_\\2\\3\\4-(seq).grib","title":"Regional River Forecast Center (RFC) Quantitative Precipitation Forecast (QPF)"},{"location":"edex/data-grids/#ocean-models","text":"","title":"Ocean Models"},{"location":"edex/data-grids/#wave-233-grid-global-noaa-wavewatch-iii-wavewatch","text":"HDS ^O.J.88 KWBJ.*!grib/ncep/[^/]*/#[^/]*/(............)/F(...)/(.*)/(.*)/ FILE -edex -log /awips2/data_store/grid/WAVE-HDS-WaveWatch/WaveWatch_\\1_F\\2_\\3_\\4_%H%M%S.grib","title":"WAVE 233 Grid - Global NOAA WAVEWATCH III (WaveWatch)"},{"location":"edex/data-grids/#wave-238-grid-regional-western-north-atlantic-wave-wna","text":"HDS ^O.M.88 KWBJ.*!grib/ncep/[^/]*/#[^/]*/(............)/F(...)/(.*)/(.*)/ FILE -edex -log /awips2/data_store/grid/WAVE-HDS-WNAWAVE238/WNAWAVE238_\\1_F\\2_\\3_\\4_%H%M%S.grib","title":"WAVE 238 Grid - Regional Western North Atlantic (WAVE-WNA)"},{"location":"edex/data-grids/#wave-238-grid-regional-western-north-atlantic-hurricane-wave-wna-hurr","text":"HDS ^O.O.88 KWBJ.*!grib/ncep/[^/]*/#[^/]*/(............)/F(...)/(.*)/(.*)/ FILE -edex -log /awips2/data_store/grid/WAVE-HDS-HurWave238/HurWave238_\\1_F\\2_\\3_\\4_%H%M%S.grib","title":"WAVE 238 Grid - Regional Western North Atlantic Hurricane (WAVE-WNA-Hurr)"},{"location":"edex/data-grids/#wave-239-grid-regional-alaska-waters-wave-ak","text":"HDS ^O.N.88 KWBJ.*!grib/ncep/[^/]*/#[^/]*/(............)/F(...)/(.*)/(.*)/ FILE -edex -log /awips2/data_store/grid/WAVE-HDS-AKWAVE239/AKWAVE239_\\1_F\\2_\\3_\\4_%H%M%S.grib","title":"WAVE 239 Grid - Regional Alaska Waters (WAVE-AK)"},{"location":"edex/data-grids/#wave-253-grid-regional-eastern-north-pacific-wave-enp","text":"HDS ^O.S.88 KWBJ.*!grib/ncep/[^/]*/#[^/]*/(............)/F(...)/(.*)/(.*)/ FILE -edex -log /awips2/data_store/grid/WAVE-HDS-ENPWAVE253/ENPWAVE253_\\1_F\\2_\\3_\\4_%H%M%S.grib","title":"WAVE 253 Grid - Regional Eastern North Pacific (WAVE-ENP)"},{"location":"edex/data-grids/#wave-253-grid-regional-eastern-north-pacific-hurricane-wave-enp-hurr","text":"HDS ^O.Q.88 
KWBJ.*!grib/ncep/[^/]*/#[^/]*/(............)/F(...)/(.*)/(.*)/ FILE -edex -log /awips2/data_store/grid/WAVE-HDS-HurWave253/HurWave253\\1_F\\2_\\3_\\4_%H%M%S.grib","title":"WAVE 253 Grid - Regional Eastern North Pacific Hurricane (WAVE-ENP-Hurr)"},{"location":"edex/data-grids/#ww3-global","text":"NGRID E.A.88 KWBJ.*ncep/[^/]*/#[^/]*/(............)F(...)/.* FILE -edex -log /awips2/data_store/grid/WW3_Global/WW3_Global_\\1_\\200.grib2","title":"WW3 Global"},{"location":"edex/data-grids/#ww3-regional-alaska","text":"NGRID E.E.88 KWBJ.*ncep/[^/]*/#[^/]*/(............)F(...)/.* FILE -edex -log /awips2/data_store/grid/WW3_Regional_Alaska/WW3_Regional_Alaska_\\1_\\200.grib2","title":"WW3 Regional Alaska"},{"location":"edex/data-grids/#ww3-coastal-alaska","text":"NGRID E.F.88 KWBJ.*ncep/[^/]*/#[^/]*/(............)F(...)/.* FILE -edex -log /awips2/data_store/grid/WW3_Coastal_Alaska/WW3_Coastal_Alaska_\\1_\\200.grib2","title":"WW3 Coastal Alaska"},{"location":"edex/data-grids/#ww3-eastern-pacific-regional","text":"NGRID E.D.88 KWBJ.*ncep/[^/]*/#[^/]*/(............)F(...)/.* FILE -edex -log /awips2/data_store/grid/WW3_Regional_Eastern_Pacific/WW3_Regional_Eastern_Pacific_\\1_\\200.grib2","title":"WW3 Eastern Pacific (Regional)"},{"location":"edex/data-grids/#ww3-us-east-coast-regional","text":"NGRID E.B.88 KWBJ.*ncep/[^/]*/#[^/]*/(............)F(...)/.* FILE -edex -log /awips2/data_store/grid/WW3_Regional_US_East_Coast/WW3_Regional_US_East_Coast_\\1_\\200.grib2","title":"WW3 US East Coast (Regional)"},{"location":"edex/data-grids/#ww3-us-east-coast-coastal","text":"NGRID E.H.88 KWBJ.*ncep/[^/]*/#[^/]*/(............)F(...)/.* FILE -edex -log /awips2/data_store/grid/WW3_Coastal_US_East_Coast/WW3_Coastal_US_East_Coast_\\1_\\200.grib2","title":"WW3 US East Coast (Coastal)"},{"location":"edex/data-grids/#ww3-us-west-coast-regional","text":"NGRID E.C.88 KWBJ.*ncep/[^/]*/#[^/]*/(............)F(...)/.* FILE -edex -log /awips2/data_store/grid/WW3_Regional_US_West_Coast/WW3_Regional_US_West_Coast_\\1_\\200.grib2","title":"WW3 US West Coast (Regional)"},{"location":"edex/data-grids/#ww3-us-west-coast-coastal","text":"NGRID E.G.88 KWBJ.*ncep/[^/]*/#[^/]*/(............)F(...)/.* FILE -edex -log /awips2/data_store/grid/WW3_Coastal_US_West_Coast/WW3_Coastal_US_West_Coast_\\1_\\200.grib2","title":"WW3 US West Coast (Coastal)"},{"location":"edex/data-grids/#estofs-us","text":"NGRID ^E[EHC]I[A-Z]88 KWBM ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/ESTOFS/ESTOFS_\\1_\\2_\\3_\\4-(seq).grib","title":"ESTOFS - US"},{"location":"edex/data-grids/#estofs-puerto-rico","text":"NGRID ^E[EHC]P[A-Z]88 KWBM ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/ESTOFS-PR/ESTOFS-PR_\\1_\\2_\\3_\\4-(seq).grib","title":"ESTOFS - Puerto Rico"},{"location":"edex/data-grids/#estofs-pacific-alaska","text":"NGRID ^E[EHC]A[A-Z]88 KWBM ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/ESTOFS-AK/ESTOFS-AK_\\1_\\2_\\3_\\4-(seq).grib","title":"ESTOFS Pacific - Alaska"},{"location":"edex/data-grids/#estofs-pacific-conus-west-coast","text":"NGRID ^E[EHC]D[A-Z]88 KWBM ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/ESTOFS-WC/ESTOFS-WC_\\1_\\2_\\3_\\4-(seq).grib","title":"ESTOFS Pacific - CONUS (West Coast)"},{"location":"edex/data-grids/#estofs-pacific-hawaii","text":"NGRID 
^E[EHC]H[A-Z]88 KWBM ......[^!]*!grib.*/[^/]*/[^/]*/#[^/]*/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -edex -log /awips2/data_store/grid/ESTOFS-HI/ESTOFS-HI_\\1_\\2_\\3_\\4-(seq).grib","title":"ESTOFS Pacific - Hawaii"},{"location":"edex/data-grids/#extra-tropical-storm-surge-etss","text":"NGRID ^MHU... KNHC (..)(..)(..) FILE -edex -log /awips2/data_store/grid/ETSS/ETSS_\\1\\2\\3-(seq).grib2","title":"Extra-Tropical Storm Surge (ETSS)"},{"location":"edex/data-grids/#glerl","text":"HDS ^O.N.88 KWNB.*!grib/161/([^/]*)/#([^/]*)/(............)/F(...)/.* FILE -edex -log /awips2/data_store/grid/GLERL/GLERL_\\1_F\\2_%H%M%S.grib","title":"GLERL"},{"location":"edex/data-grids/#important-files-and-directories","text":"|---|---| | location on disk | /awips2/edex/data/hdf5/grid | | definition files | /awips2/edex/data/utility/edex_static/base/grib/models | | navigation files | /awips2/edex/data/utility/edex_static/base/grib/grids | | grib1 definitions | /awips2/edex/data/utility/common_static/base/grid | | D2D files | /awips2/edex/data/utility/edex_static/base/grib/grids | | metadata tables | grid | | | grid_info | | | gridcoverage |","title":"Important Files and Directories"},{"location":"edex/data-plugins/","text":"td:first-child { font-weight: bold } AWIPS Plugins and Supported Data Types \uf0c1 NAME DESCRIPTION aqi Air Quality Index data bufrmos Model Output Statistics bufrua Upper air radiosonde data climate-hmdb Climate text products geodata NetCDF JTS Geometry records geomag SWPC Geomagnetic Forecast (RTKP) gfe Graphical Forecast Editor grids ghcd SWPC Generic High Cadence Data gpd NCEP Generic Point Data grid Binary gridded data grib1/grib2 idft Ice Drift Forecasts madis NCEP Meteorological Assimilation Data Ingest System ( MADIS ) manualIngest Manual data ingest plugin metartohmdb Adds metar records to the Verification and Climate database modelsounding Individual grid point soundings from the GFS and NAM models mping Meteorological Phenomena Identification Near the Ground ( mPING ) ncpafm Point/Area Forecast Matrices data nctext NCEP Text decoders ncuair NCEP Upper Air decoder ndm National Dataset Maintenance ingester ntrans NCCEP Ntrans Metafiles obs Surface observations from METARs pgen NCEP NAWIPS PGEN decoder redbook Redbook graphics sfcobs Surface observations other than METAR format including buoys solarimage SWPC Solar imagery ssha NCEP Sea Surface Height Anomaly BUFR data text Various Text Products vaa Volcanic ash advisories AWIPS Plugins for Remote Sensing/Lightning \uf0c1 NAME DESCRIPTION binlightning Lightning data from the National Lightning Detection Network bufrascat Advanced Scatterometer wind data bufrhdw GOES High Density Winds bufrmthdw MTSAT (Japanese Multi-Functional Transport Satellite) High Density Winds bufrssmi Special Sensor Microwave/Imager data from DMSP (Defesne Meteorological Satellite Program) satellites crimss NPP/NPOESS CrIMSS (Cross Track Infrared and Microwave Sounding Suite) soundings dmw GOES-R Derived Motion Winds glm GOES Geostationary Lightning Mapper goesr Plugins to decode and display GOES-R products goessounding GOES Satellite Soundings lma Lightning Mapping Array mcidas NCEP decoder for McIDAS AREA files modis NASA Moderate-resolution Imaging Spectroradiometer ncscat NCEP ASCAT/Quikscat records npp National Polar-Orbiting Partnership Satellites Soundings nucaps Soundings from NOAA Unique CrIS/ATMS Processing System from NPP (National Polar-Orbiting Partnership) Satellites poessounding Polar Operational Environmental Satellite soundings radar 
WSR-88D and TDWR Level 3 data regionalsat Decoder implementation for netcdf3 files generated by the Alaska Region and GOES-R Proving Ground satellite-gini GINI-formatted satellite imagery (GOES, POES, VIIRS, FNEXRAD) satellite-mcidas McIDAS area files (Raytheon/D2D-developed) viirs NPP Visible Infrared Imaging Radiometer Suite data sgwh NCEP BUFR Significant Wave Height data - SGWH (Jason-1), SGWHA (Altika), SGWHC (CryoSat), SGWHE (Envisat), SGWHG (GFO), SGWH2 (Jason-2), or Jason-3 textlightning Text lightning data AWIPS Plugins for Decision Assistance (Watch/Warn/Hazards/Hydro) \uf0c1 NAME DESCRIPTION atcf Automated Tropical Cyclone Forecast convectprob NOAA/CIMSS Prob Severe Model editedregions Hazard Services Edited Regions editedevents Hazard Services Edited Events cwat County Warning Area Threat produced by SCAN (System for Convection Analysis and Nowcasting). CWAT was formerly called SCAN Convective Threat Index (SCTI). ffg Flash flood guidance metadata (countybased ffg from RFCs) ffmp Flash Flood Monitoring and Prediction data (raw data inputs: radar, gridded flash flood guidance from River Forecast Centers, highresolution precipitation estimates [HPE] and nowcasts [HPN], QPF from SCAN and gage data from the IHFS [Integrated Hydrologic Forecast System] database. Radar data [with WSR-88D product mnemonics and numbers] needed for FFMP are Digital Hybrid Reflectivity [DHR, 32] and Digital Precipitation Rate [DPR, 176]. The raw GRIB files containing RFC Flash Flood Guidance are identified in the tables in Part 2 of this document as NWS_151 or FFG-XXX, where XXX is an RFC identifier such as TUA, KRF, or ALR. The WMO header for the RFC FFG begins with \u201cZEGZ98\u201d. ) fog Fog Monitor . Raw data inputs: METAR, Mesonet, maritime, buoys, MAROBs, and satellite [visible, 3.9 \u00b5m, and 10.7 \u00b5m]) freezingLevel MPE Rapid Refresh Freezing Level scheduled process (MpeRUCFreezingLevel) fssobs Observations for the Fog monitor, SNOW, and SAFESEAS (raw data inputs: METAR, Mesonet, maritime, buoys, MAROBs). lsr Local Storm Reports mpe Multi-sensor Precipitation Estimation preciprate Precipitation Rate from SCAN. Raw data input: radar data [with WSR-88D product mnemonic and number] needed for preciprate are Digital Hybrid Reflectivity [DHR, 32]. qpf Quantitative Precipitation Forecast from SCAN. (raw data inputs: radar and some RAP13 fields. Radar data [with WSR-88D product mnemonics and numbers] needed for SCAN\u2019s QPF are 0.5 degree Base Reflectivity [Z, 19], 4 km Vertically Integrated Liquid [VIL, 57], and Storm Track [STI, 58]. The RAP13 field needed is 700 mb Wind, as defined in the SCANRunSiteConfig.xml file.) satpre Satellite-estimated Pecipiration (hydroApps) scan SCAN (System for Convection Analysis and Nowcasting). (Inputs for the SCAN Table include radar, cloud-to-ground lightning from the NLDN, fields from RAP13, and CWAT. Specific radar products [with WSR-88D product mnemonics and numbers] are: 1 km Composite Reflectivity [CZ, 37]; 0.5 degree Base Reflectivity [Z, 19]; 4 km Vertically Integrated Liquid [VIL, 57]; Storm Track [STI, 58]; Mesocyclone Detections [MD, 141]; and Tornadic Vortex Signature [TVS, 61]. shef Standard Hydrometeorological Exchange Format data. 
warning Watches, Warnings, and Advisories wcp SPC Convective Watches svrwx SPC Local Storm Report Summaries tcg Tropical Cyclone Guidance tcm Tropical Cyclone Forecast/Advisory tcs Tropical Cyclone Forecast/Advisory stormtrack NCEP StormTrack Plug-In (Automatic Tropical Cyclone Forecast & Ensemble cyclones) vil Cell-based Vertically Integrated Liquid from SCAN (Input is radar) spc Storm Prediction Center Convective Outlook KML files AWIPS Plugins for Aviation \uf0c1 NAME DESCRIPTION acars Aircraft Communications Addressing and Reporting System (ACARS) observations acarssounding Vertical profiles derived from ACARS data airep Automated Aircraft Reports airmet \u201cAirmen\u2019s Meteorological Information\u201d: aviation weather advisories for potentially hazardous, but non-severe weather asdi FAA Aircraft Situation Data for Industry aww Airport Weather Warning bufrncwf National Convective Weather Forecast for Aviation bufrsigwx Aviation Significant Weather ccfp Aviation Collaborative Convective Forecast Product convsigmet Aviation Significant Meteorological Information for convective weather cwa Aviation Center Weather Advisory, issued by CWSUs (Center Weather Service Units) intlsigmet International Significant Meteorological Information for Aviation nctaf NCEP TAF decoders nonconvsigmet Aviation Significant Meteorological Information for non-convective weather pirep Pilot Reports taf Terminal Aerodrome Forecasts","title":"Data Plugins"},{"location":"edex/data-plugins/#awips-plugins-and-supported-data-types","text":"NAME DESCRIPTION aqi Air Quality Index data bufrmos Model Output Statistics bufrua Upper air radiosonde data climate-hmdb Climate text products geodata NetCDF JTS Geometry records geomag SWPC Geomagnetic Forecast (RTKP) gfe Graphical Forecast Editor grids ghcd SWPC Generic High Cadence Data gpd NCEP Generic Point Data grid Binary gridded data grib1/grib2 idft Ice Drift Forecasts madis NCEP Meteorological Assimilation Data Ingest System ( MADIS ) manualIngest Manual data ingest plugin metartohmdb Adds metar records to the Verification and Climate database modelsounding Individual grid point soundings from the GFS and NAM models mping Meteorological Phenomena Identification Near the Ground ( mPING ) ncpafm Point/Area Forecast Matrices data nctext NCEP Text decoders ncuair NCEP Upper Air decoder ndm National Dataset Maintenance ingester ntrans NCCEP Ntrans Metafiles obs Surface observations from METARs pgen NCEP NAWIPS PGEN decoder redbook Redbook graphics sfcobs Surface observations other than METAR format including buoys solarimage SWPC Solar imagery ssha NCEP Sea Surface Height Anomaly BUFR data text Various Text Products vaa Volcanic ash advisories","title":"AWIPS Plugins and Supported Data Types"},{"location":"edex/data-plugins/#awips-plugins-for-remote-sensinglightning","text":"NAME DESCRIPTION binlightning Lightning data from the National Lightning Detection Network bufrascat Advanced Scatterometer wind data bufrhdw GOES High Density Winds bufrmthdw MTSAT (Japanese Multi-Functional Transport Satellite) High Density Winds bufrssmi Special Sensor Microwave/Imager data from DMSP (Defesne Meteorological Satellite Program) satellites crimss NPP/NPOESS CrIMSS (Cross Track Infrared and Microwave Sounding Suite) soundings dmw GOES-R Derived Motion Winds glm GOES Geostationary Lightning Mapper goesr Plugins to decode and display GOES-R products goessounding GOES Satellite Soundings lma Lightning Mapping Array mcidas NCEP decoder for McIDAS AREA files modis NASA Moderate-resolution 
Imaging Spectroradiometer ncscat NCEP ASCAT/Quikscat records npp National Polar-Orbiting Partnership Satellites Soundings nucaps Soundings from NOAA Unique CrIS/ATMS Processing System from NPP (National Polar-Orbiting Partnership) Satellites poessounding Polar Operational Environmental Satellite soundings radar WSR-88D and TDWR Level 3 data regionalsat Decoder implementation for netcdf3 files generated by the Alaska Region and GOES-R Proving Ground satellite-gini GINI-formatted satellite imagery (GOES, POES, VIIRS, FNEXRAD) satellite-mcidas McIDAS area files (Raytheon/D2D-developed) viirs NPP Visible Infrared Imaging Radiometer Suite data sgwh NCEP BUFR Significant Wave Height data - SGWH (Jason-1), SGWHA (Altika), SGWHC (CryoSat), SGWHE (Envisat), SGWHG (GFO), SGWH2 (Jason-2), or Jason-3 textlightning Text lightning data","title":"AWIPS Plugins for Remote Sensing/Lightning"},{"location":"edex/data-plugins/#awips-plugins-for-decision-assistance-watchwarnhazardshydro","text":"NAME DESCRIPTION atcf Automated Tropical Cyclone Forecast convectprob NOAA/CIMSS Prob Severe Model editedregions Hazard Services Edited Regions editedevents Hazard Services Edited Events cwat County Warning Area Threat produced by SCAN (System for Convection Analysis and Nowcasting). CWAT was formerly called SCAN Convective Threat Index (SCTI). ffg Flash flood guidance metadata (countybased ffg from RFCs) ffmp Flash Flood Monitoring and Prediction data (raw data inputs: radar, gridded flash flood guidance from River Forecast Centers, highresolution precipitation estimates [HPE] and nowcasts [HPN], QPF from SCAN and gage data from the IHFS [Integrated Hydrologic Forecast System] database. Radar data [with WSR-88D product mnemonics and numbers] needed for FFMP are Digital Hybrid Reflectivity [DHR, 32] and Digital Precipitation Rate [DPR, 176]. The raw GRIB files containing RFC Flash Flood Guidance are identified in the tables in Part 2 of this document as NWS_151 or FFG-XXX, where XXX is an RFC identifier such as TUA, KRF, or ALR. The WMO header for the RFC FFG begins with \u201cZEGZ98\u201d. ) fog Fog Monitor . Raw data inputs: METAR, Mesonet, maritime, buoys, MAROBs, and satellite [visible, 3.9 \u00b5m, and 10.7 \u00b5m]) freezingLevel MPE Rapid Refresh Freezing Level scheduled process (MpeRUCFreezingLevel) fssobs Observations for the Fog monitor, SNOW, and SAFESEAS (raw data inputs: METAR, Mesonet, maritime, buoys, MAROBs). lsr Local Storm Reports mpe Multi-sensor Precipitation Estimation preciprate Precipitation Rate from SCAN. Raw data input: radar data [with WSR-88D product mnemonic and number] needed for preciprate are Digital Hybrid Reflectivity [DHR, 32]. qpf Quantitative Precipitation Forecast from SCAN. (raw data inputs: radar and some RAP13 fields. Radar data [with WSR-88D product mnemonics and numbers] needed for SCAN\u2019s QPF are 0.5 degree Base Reflectivity [Z, 19], 4 km Vertically Integrated Liquid [VIL, 57], and Storm Track [STI, 58]. The RAP13 field needed is 700 mb Wind, as defined in the SCANRunSiteConfig.xml file.) satpre Satellite-estimated Pecipiration (hydroApps) scan SCAN (System for Convection Analysis and Nowcasting). (Inputs for the SCAN Table include radar, cloud-to-ground lightning from the NLDN, fields from RAP13, and CWAT. 
Specific radar products [with WSR-88D product mnemonics and numbers] are: 1 km Composite Reflectivity [CZ, 37]; 0.5 degree Base Reflectivity [Z, 19]; 4 km Vertically Integrated Liquid [VIL, 57]; Storm Track [STI, 58]; Mesocyclone Detections [MD, 141]; and Tornadic Vortex Signature [TVS, 61]. shef Standard Hydrometeorological Exchange Format data. warning Watches, Warnings, and Advisories wcp SPC Convective Watches svrwx SPC Local Storm Report Summaries tcg Tropical Cyclone Guidance tcm Tropical Cyclone Forecast/Advisory tcs Tropical Cyclone Forecast/Advisory stormtrack NCEP StormTrack Plug-In (Automatic Tropical Cyclone Forecast & Ensemble cyclones) vil Cell-based Vertically Integrated Liquid from SCAN (Input is radar) spc Storm Prediction Center Convective Outlook KML files","title":"AWIPS Plugins for Decision Assistance (Watch/Warn/Hazards/Hydro)"},{"location":"edex/data-plugins/#awips-plugins-for-aviation","text":"NAME DESCRIPTION acars Aircraft Communications Addressing and Reporting System (ACARS) observations acarssounding Vertical profiles derived from ACARS data airep Automated Aircraft Reports airmet \u201cAirmen\u2019s Meteorological Information\u201d: aviation weather advisories for potentially hazardous, but non-severe weather asdi FAA Aircraft Situation Data for Industry aww Airport Weather Warning bufrncwf National Convective Weather Forecast for Aviation bufrsigwx Aviation Significant Weather ccfp Aviation Collaborative Convective Forecast Product convsigmet Aviation Significant Meteorological Information for convective weather cwa Aviation Center Weather Advisory, issued by CWSUs (Center Weather Service Units) intlsigmet International Significant Meteorological Information for Aviation nctaf NCEP TAF decoders nonconvsigmet Aviation Significant Meteorological Information for non-convective weather pirep Pilot Reports taf Terminal Aerodrome Forecasts","title":"AWIPS Plugins for Aviation"},{"location":"edex/data-purge/","text":"Purging and Retention \uf0c1 Purge Types \uf0c1 There are two main forms of data purging in AWIPS. The most often thought of is the purging for processed data . This has to do with how long data is stored for after it has been decoded and processed. The second type of purging has to do with raw data . This has to do with how long data is stored for before it has been decoded. Processed Data Purging \uf0c1 AWIPS uses a plugin-based purge strategy for processed HDF5 data . This allows the user to change the purge frequency for each plugin individually, and even set purge rules for specific products for a particular plugin. There is also a default purge rules file for those products which do not have specific rules written. Note : Purging is triggered by a quartz timer event that fires at 30 minutes after each hour. Purging rules are defined in XML files in the Localization Store. On EDEX, most are located in /awips2/edex/data/utility/common_static/base/purge , and follow the base/site localization pattern (e.g. site purge files are in site/XXX/purge rather than base/purge , where XXX is the site identifier). Each data set can have a purge rule defined, and the xml file is named after the data set: ls /awips2/edex/data/utility/common_static/base/purge/ acarsPurgeRules.xml bufruaPurgeRules.xml pirepPurgeRules.xml acarssoundingPurgeRules.xml ccfpPurgeRules.xml poessoundingPurgeRules.xml aggregatePurgeRules.xml convsigmetPurgeRules.xml pointsetPurgeRules.xml airepPurgeRules.xml cwaPurgeRules.xml profilerPurgeRules.xml ... 
Time-based purge \uf0c1 If a plugin has no XML file, the default rule of 1 day (24 hours) is used, from /awips2/edex/data/utility/common_static/base/purge/defaultPurgeRules.xml : 01-00:00:00 Time-based purging is set with the period tag and uses the reference time of the data. The reference time of the data is determined by the decoder. 30-day NEXRAD3 Example \uf0c1 Modify /awips2/edex/data/utility/common_static/base/purge/radarPurgeRules.xml to increase the data retention period from 1 to 31 days: 31-00:00:00 Note : you do NOT have to restart EDEX when you change a purge rule! Frame-Based Purge \uf0c1 Some plugins use frame-based purging, retaining a certain number of product \"versions\". /awips2/edex/data/utility/common_static/base/purge/gridPurgeRules.xml 2 07-00:00:00 LAPS 30 NAM(?:12|20|40) 2 00-00:15:00 ... In the above example, notice a default rule (2) is specified, as well as specific models with their own rules. The tag modTimeToWait can be used in conjunction with versionsToKeep and will increase the versionsToKeep by 1 if data matching this rule has been stored within modTimeToWait. Purge Logs \uf0c1 Data purge events are logged to the file edex-ingest-purge-[yyyymmdd].log , where [yyyymmdd] is the date stamp. tail -f edex-ingest-purge-20120327.log --------START LOG PURGE--------- INFO 2012-03-27 00:30:00,027 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Skipped file with invalid fileName: afos-trigger.log INFO 2012-03-27 00:30:00,193 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Removed 1 old files INFO 2012-03-27 00:31:23,155 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Archived 14 files INFO 2012-03-27 00:31:23,155 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Skipped processing 1 files INFO 2012-03-27 00:31:23,155 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::---------END LOG PURGE----------- All Purge Rules \uf0c1 To see all purge rule directories (base, site, configured): find /awips2/edex/data/utility -name purge /awips2/edex/data/utility/common_static/base/purge If any overrides have been made, then it's possible that site directories may show up as results from the find command as well. Raw Data Purging \uf0c1 Raw data are files that have been brought in by the LDM and recognized by an action in the pqact.conf file. These files are written to subdirectories of /awips2/data_store/ . This data will wait here until it is purged, from the purging rules defined in /awips2/edex/data/utility/common_static/base/archiver/purger/RAW_DATA.xml . If the purge time is too short, and the processing latencies on EDEX are too long, it is possible that EDEX will miss some of this data, and the purge times will need to be adjusted by changing the defaultRetentionHours or selectedRetentionHours tag on the relevant data sets. Default Retention \uf0c1 The defaultRetentionHours tag is defined at the beginning of the RAW_DATA.xml file. It is the duration that will apply to any piece of data that does not fall under an explicitly defined category . The default value for our EDEX is 1 hour: Raw /awips2/data_store/ 1 ... Selected Retention \uf0c1 Data sets are broken up into categories in the RAW_DATA.xml file. These categories are groupings of similar data. Each category has a selectedRetentionHours tag which specifies how long the matching data will be kept for. For example, there is a Model category which sets the purge time to 3 hours for all grib, bufrmos, and modelsounding data: ... 
Model 3 (grib|grib2)/(\\d{4})(\\d{2})(\\d{2})/(\\d{2})/(.*) {1} - {6} 2,3,4,5 (bufrmos|modelsounding)/(\\d{4})(\\d{2})(\\d{2})/(\\d{2}) {1} 2,3,4,5 ... Logging \uf0c1 Raw data purging can be seen in the purge logs as well ( /awips2/edex/logs/edex-ingest-purge-[yyyymmdd].log where [yyyymmdd] is the date stamp). [centos@tg-atm160027-edex-dev purge]$ grep -i 'archive' /awips2/edex/logs/edex-ingest-purge-20200728.log INFO 2020-07-28 20:05:23,959 2329 [Purge-Archive] ArchivePurgeManager: EDEX - Start purge of category Raw - Observation, directory \"/awips2/data_store/bufrhdw\". INFO 2020-07-28 20:05:23,960 2330 [Purge-Archive] ArchivePurgeManager: EDEX - End purge of category Raw - Observation, directory \"/awips2/data_store/bufrhdw\", deleted 0 files and directories. INFO 2020-07-28 20:05:23,961 2331 [Purge-Archive] ArchivePurgeManager: EDEX - Unlocked: \"/awips2/data_store/bufrhdw\" INFO 2020-07-28 20:05:23,963 2332 [Purge-Archive] ArchivePurgeManager: EDEX - Locked: \"/awips2/data_store/xml\" INFO 2020-07-28 20:05:23,963 2333 [Purge-Archive] ArchivePurgeManager: EDEX - Start purge of category Raw - Products, directory \"/awips2/data_store/xml\". INFO 2020-07-28 20:05:23,964 2334 [Purge-Archive] ArchivePurgeManager: EDEX - End purge of category Raw - Products, directory \"/awips2/data_store/xml\", deleted 5 files and directories. INFO 2020-07-28 20:05:23,967 2335 [Purge-Archive] ArchivePurgeManager: EDEX - Unlocked: \"/awips2/data_store/xml\" INFO 2020-07-28 20:05:23,967 2336 [Purge-Archive] ArchivePurger: EDEX - Raw::Archive Purged 28387 files in 23.8s. INFO 2020-07-28 20:05:23,979 2337 [Purge-Archive] ArchivePurgeManager: EDEX - Purging directory: \"/awips2/edex/data/archive\". INFO 2020-07-28 20:05:23,992 2338 [Purge-Archive] ArchivePurger: EDEX - Processed::Archive Purged 0 files in 25ms. INFO 2020-07-28 20:05:23,992 2339 [Purge-Archive] ArchivePurger: EDEX - Archive Purge finished. Time to run: 23.9s ...","title":"Purging and Retention"},{"location":"edex/data-purge/#purging-and-retention","text":"","title":"Purging and Retention"},{"location":"edex/data-purge/#purge-types","text":"There are two main forms of data purging in AWIPS. The most often thought of is the purging for processed data . This has to do with how long data is stored for after it has been decoded and processed. The second type of purging has to do with raw data . This has to do with how long data is stored for before it has been decoded.","title":"Purge Types"},{"location":"edex/data-purge/#processed-data-purging","text":"AWIPS uses a plugin-based purge strategy for processed HDF5 data . This allows the user to change the purge frequency for each plugin individually, and even set purge rules for specific products for a particular plugin. There is also a default purge rules file for those products which do not have specific rules written. Note : Purging is triggered by a quartz timer event that fires at 30 minutes after each hour. Purging rules are defined in XML files in the Localization Store. On EDEX, most are located in /awips2/edex/data/utility/common_static/base/purge , and follow the base/site localization pattern (e.g. site purge files are in site/XXX/purge rather than base/purge , where XXX is the site identifier). 
Each data set can have a purge rule defined, and the xml file is named after the data set: ls /awips2/edex/data/utility/common_static/base/purge/ acarsPurgeRules.xml bufruaPurgeRules.xml pirepPurgeRules.xml acarssoundingPurgeRules.xml ccfpPurgeRules.xml poessoundingPurgeRules.xml aggregatePurgeRules.xml convsigmetPurgeRules.xml pointsetPurgeRules.xml airepPurgeRules.xml cwaPurgeRules.xml profilerPurgeRules.xml ...","title":"Processed Data Purging"},{"location":"edex/data-purge/#time-based-purge","text":"If a plugin has no XML file, the default rule of 1 day (24 hours) is used, from /awips2/edex/data/utility/common_static/base/purge/defaultPurgeRules.xml : 01-00:00:00 Time-based purging is set with the period tag and uses the reference time of the data. The reference time of the data is determined by the decoder.","title":"Time-based purge"},{"location":"edex/data-purge/#30-day-nexrad3-example","text":"Modify /awips2/edex/data/utility/common_static/base/purge/radarPurgeRules.xml to increase the data retention period from 1 to 31 days: 31-00:00:00 Note : you do NOT have to restart EDEX when you change a purge rule!","title":"30-day NEXRAD3 Example"},{"location":"edex/data-purge/#frame-based-purge","text":"Some plugins use frame-based purging, retaining a certain number of product \"versions\". /awips2/edex/data/utility/common_static/base/purge/gridPurgeRules.xml 2 07-00:00:00 LAPS 30 NAM(?:12|20|40) 2 00-00:15:00 ... In the above example, notice a default rule (2) is specified, as well as specific models with their own rules. The tag modTimeToWait can be used in conjunction with versionsToKeep and will increase the versionsToKeep by 1 if data matching this rule has been stored within modTimeToWait.","title":"Frame-Based Purge"},{"location":"edex/data-purge/#purge-logs","text":"Data purge events are logged to the file edex-ingest-purge-[yyyymmdd].log , where [yyyymmdd] is the date stamp. tail -f edex-ingest-purge-20120327.log --------START LOG PURGE--------- INFO 2012-03-27 00:30:00,027 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Skipped file with invalid fileName: afos-trigger.log INFO 2012-03-27 00:30:00,193 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Removed 1 old files INFO 2012-03-27 00:31:23,155 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Archived 14 files INFO 2012-03-27 00:31:23,155 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Skipped processing 1 files INFO 2012-03-27 00:31:23,155 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::---------END LOG PURGE-----------","title":"Purge Logs"},{"location":"edex/data-purge/#all-purge-rules","text":"To see all purge rule directories (base, site, configured): find /awips2/edex/data/utility -name purge /awips2/edex/data/utility/common_static/base/purge If any overrides have been made, then it's possible that site directories may show up as results from the find command as well.","title":"All Purge Rules"},{"location":"edex/data-purge/#raw-data-purging","text":"Raw data are files that have been brought in by the LDM and recognized by an action in the pqact.conf file. These files are written to subdirectories of /awips2/data_store/ . This data will wait here until it is purged, from the purging rules defined in /awips2/edex/data/utility/common_static/base/archiver/purger/RAW_DATA.xml . 
If the purge time is too short, and the processing latencies on EDEX are too long, it is possible that EDEX will miss some of this data, and the purge times will need to be adjusted by changing the defaultRetentionHours or selectedRetentionHours tag on the relevant data sets.","title":"Raw Data Purging"},{"location":"edex/data-purge/#default-retention","text":"The defaultRetentionHours tag is defined at the beginning of the RAW_DATA.xml file. It is the duration that will apply to any piece of data that does not fall under an explicitly defined category . The default value for our EDEX is 1 hour: Raw /awips2/data_store/ 1 ...","title":"Default Retention"},{"location":"edex/data-purge/#selected-retention","text":"Data sets are broken up into categories in the RAW_DATA.xml file. These categories are groupings of similar data. Each category has a selectedRetentionHours tag which specifies how long the matching data will be kept. For example, there is a Model category which sets the purge time to 3 hours for all grib, bufrmos, and modelsounding data: ... Model 3 (grib|grib2)/(\\d{4})(\\d{2})(\\d{2})/(\\d{2})/(.*) {1} - {6} 2,3,4,5 (bufrmos|modelsounding)/(\\d{4})(\\d{2})(\\d{2})/(\\d{2}) {1} 2,3,4,5 ...","title":"Selected Retention"},{"location":"edex/data-purge/#logging","text":"Raw data purging can be seen in the purge logs as well ( /awips2/edex/logs/edex-ingest-purge-[yyyymmdd].log where [yyyymmdd] is the date stamp). [centos@tg-atm160027-edex-dev purge]$ grep -i 'archive' /awips2/edex/logs/edex-ingest-purge-20200728.log INFO 2020-07-28 20:05:23,959 2329 [Purge-Archive] ArchivePurgeManager: EDEX - Start purge of category Raw - Observation, directory \"/awips2/data_store/bufrhdw\". INFO 2020-07-28 20:05:23,960 2330 [Purge-Archive] ArchivePurgeManager: EDEX - End purge of category Raw - Observation, directory \"/awips2/data_store/bufrhdw\", deleted 0 files and directories. INFO 2020-07-28 20:05:23,961 2331 [Purge-Archive] ArchivePurgeManager: EDEX - Unlocked: \"/awips2/data_store/bufrhdw\" INFO 2020-07-28 20:05:23,963 2332 [Purge-Archive] ArchivePurgeManager: EDEX - Locked: \"/awips2/data_store/xml\" INFO 2020-07-28 20:05:23,963 2333 [Purge-Archive] ArchivePurgeManager: EDEX - Start purge of category Raw - Products, directory \"/awips2/data_store/xml\". INFO 2020-07-28 20:05:23,964 2334 [Purge-Archive] ArchivePurgeManager: EDEX - End purge of category Raw - Products, directory \"/awips2/data_store/xml\", deleted 5 files and directories. INFO 2020-07-28 20:05:23,967 2335 [Purge-Archive] ArchivePurgeManager: EDEX - Unlocked: \"/awips2/data_store/xml\" INFO 2020-07-28 20:05:23,967 2336 [Purge-Archive] ArchivePurger: EDEX - Raw::Archive Purged 28387 files in 23.8s. INFO 2020-07-28 20:05:23,979 2337 [Purge-Archive] ArchivePurgeManager: EDEX - Purging directory: \"/awips2/edex/data/archive\". INFO 2020-07-28 20:05:23,992 2338 [Purge-Archive] ArchivePurger: EDEX - Processed::Archive Purged 0 files in 25ms. INFO 2020-07-28 20:05:23,992 2339 [Purge-Archive] ArchivePurger: EDEX - Archive Purge finished. Time to run: 23.9s ...","title":"Logging"},{"location":"edex/data-radar/","text":"Level 3 Radar (All) \uf0c1 NEXRAD3 ^(SDUS[23578].) .... (......) /p(...)(...) FILE -overwrite -close -edex /awips2/data_store/radar/\\4/\\3/\\1_\\4_\\3_\\2_(seq).rad Level 3 Radar (Subset) \uf0c1 NEXRAD3 ^(SDUS[23578].) .... (......) /p(DHR|DPR|DSP|DTA|DAA|DU3|DU6|DVL|EET|HHC|N3P|N0C|N0K|N0Q|N0S|N0U|N0X|N0Z|NCR|NMD|OHA)(...)
FILE -overwrite -close -edex /awips2/data_store/radar/\\4/\\3/\\1_\\4_\\3_\\2_(seq).rad FNEXRAD Composites \uf0c1 FNEXRAD ^rad/NEXRCOMP/(...)/(...)_(........)_(....) FILE -close -edex /awips2/data_store/sat/nexrcomp_\\3\\4_\\2.gini.png WSR-88D Localizations \uf0c1 WFO","title":"Data radar"},{"location":"edex/data-radar/#level-3-radar-all","text":"NEXRAD3 ^(SDUS[23578].) .... (......) /p(...)(...) FILE -overwrite -close -edex /awips2/data_store/radar/\\4/\\3/\\1_\\4_\\3_\\2_(seq).rad","title":"Level 3 Radar (All)"},{"location":"edex/data-radar/#level-3-radar-subset","text":"NEXRAD3 ^(SDUS[23578].) .... (......) /p(DHR|DPR|DSP|DTA|DAA|DU3|DU6|DVL|EET|HHC|N3P|N0C|N0K|N0Q|N0S|N0U|N0X|N0Z|NCR|NMD|OHA)(...) FILE -overwrite -close -edex /awips2/data_store/radar/\\4/\\3/\\1_\\4_\\3_\\2_(seq).rad","title":"Level 3 Radar (Subset)"},{"location":"edex/data-radar/#fnexrad-composites","text":"FNEXRAD ^rad/NEXRCOMP/(...)/(...)_(........)_(....) FILE -close -edex /awips2/data_store/sat/nexrcomp_\\3\\4_\\2.gini.png","title":"FNEXRAD Composites"},{"location":"edex/data-radar/#wsr-88d-localizations","text":"WFO","title":"WSR-88D Localizations"},{"location":"edex/data-satellite/","text":"Satellite Imagery \uf0c1 NOAAport GINI Images \uf0c1 NIMAGE ^satz/ch[0-9]/.*/(.*)/([12][0-9])([0-9][0-9])([01][0-9])([0-3][0-9]) ([0-2][0-9])([0-5][0-9])/(.*)/(.*km)/ FILE -close -overwrite -edex /awips2/data_store/sat/\\8/\\9/\\1_\\2\\3\\4\\5_\\6\\7 UNIWISC GOES-East/West Northern Hemisphere Composites \uf0c1 # GOES-East/West VIS composites UNIWISC ^pnga2area Q. (CV) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_VIS_VIS_\\6_\\7 # GOES-East/West 3.9 um composites UNIWISC ^pnga2area Q. (CS) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_3.9_3.9_\\6_\\7 # GOES-East/West WV composites UNIWISC ^pnga2area Q. (CW) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_WV_WV_\\6_\\7 # GOES-East/West IR composites UNIWISC ^pnga2area Q. (CI) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_IR_IR_\\6_\\7 # GOES-East/West 13.3 um composites UNIWISC ^pnga2area Q. (CL) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_13.3_13.3_\\6_\\7 20km Rectilinear Global Composites \uf0c1 # Global WV composite UNIWISC ^pnga2area Q. (GW) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GLOBAL_\\5_WV_WVCOMP_\\6_\\7 # Global IR composites UNIWISC ^pnga2area Q. (GI) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GLOBAL_\\5_IR_IRCOMP_\\6_\\7 30km Mollweide Global Composites \uf0c1 # Mollweide Global Water Vapor UNIWISC ^pnga2area Q. (UY) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_MOLLWEIDE_30km_WV_MOLLWV_\\6_\\7 # Mollweide Global IR UNIWISC ^pnga2area Q. (UX) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_MOLLWEIDE_30km_IR_MOLLIR_\\6_\\7 # These work # GOES Visible (UV 4km VIS disabled) UNIWISC ^pnga2area Q. 
(EV|U9) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_VIS_\\4_\\6_\\7 # GOES Water Vapor UNIWISC ^pnga2area Q. (UW|UB) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_WV_\\4_\\6_\\7 # GOES Thermal Infrared UNIWISC ^pnga2area Q. (UI|U5) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_IR_\\4_\\6_\\7 # GOES other UNIWISC ^pnga2area Q. (UD|UE|U7|U8|) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_\\4_\\6_\\7 Arctic Composite Imagery \uf0c1 UNIWISC ^pnga2area Q. (U[LNGHO]) (.*) (.*) (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ARCTIC_4km_\\4_\\6_\\7 Antarctic Composite Imagery \uf0c1 # Antarctic VIS Composite UNIWISC ^pnga2area Q. (UJ) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_VIS_\\3_\\4_\\6_\\7 # Antarctic PCOL Composite UNIWISC ^pnga2area Q. (UK) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_PCOL_\\3_\\4_\\6_\\7 # Antarctic WV Composite UNIWISC ^pnga2area Q. (UF) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_WV_\\3_\\4_\\6_\\7 # Antarctic Composite IR UNIWISC ^pnga2area Q. (U1) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_IR_\\3_\\4_\\6_\\7 GOES Sounder Derived Imagery \uf0c1 # CIMSS CAPE - McIDAS product code CE UNIWISC ^pnga2area Q0 CE .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_CAPE_\\4_\\5 # CIMSS Cloud Top Pressure - McIDAS product code CA UNIWISC ^pnga2area Q0 CA .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_CTP_\\4_\\5 # CIMSS Lifted Index - McIDAS product code CD UNIWISC ^pnga2area Q0 CD .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_LI_\\4_\\5 # CIMSS Ozone - McIDAS product code CF UNIWISC ^pnga2area Q0 CF .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_OZONE_\\4_\\5 # CIMSS Total Column Precipitable Water - McIDAS product code CB UNIWISC ^pnga2area Q0 CB .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_PW_\\4_\\5 # CIMSS Sea Surface Temperature - McIDAS product code CC UNIWISC ^pnga2area Q0 CC .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_SST_\\4_\\5 # CIMSS Northern Hemisphere Wildfire ABBA - McIDAS product code CG (inactive) UNIWISC ^pnga2area Q0 CG (.*) (.*) (.*) (.*) (........) (....) 
PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_FIRESNH_\\4_\\5 # CIMSS Southern Hemisphere Wildfire ABBA - McIDAS product code CH (inactive) UNIWISC ^pnga2area Q0 CH (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_FIRESSH_\\4_\\5","title":"Satellite Imagery"},{"location":"edex/data-satellite/#satellite-imagery","text":"","title":"Satellite Imagery"},{"location":"edex/data-satellite/#noaaport-gini-images","text":"NIMAGE ^satz/ch[0-9]/.*/(.*)/([12][0-9])([0-9][0-9])([01][0-9])([0-3][0-9]) ([0-2][0-9])([0-5][0-9])/(.*)/(.*km)/ FILE -close -overwrite -edex /awips2/data_store/sat/\\8/\\9/\\1_\\2\\3\\4\\5_\\6\\7","title":"NOAAport GINI Images"},{"location":"edex/data-satellite/#uniwisc-goes-eastwest-northern-hemisphere-composites","text":"# GOES-East/West VIS composites UNIWISC ^pnga2area Q. (CV) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_VIS_VIS_\\6_\\7 # GOES-East/West 3.9 um composites UNIWISC ^pnga2area Q. (CS) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_3.9_3.9_\\6_\\7 # GOES-East/West WV composites UNIWISC ^pnga2area Q. (CW) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_WV_WV_\\6_\\7 # GOES-East/West IR composites UNIWISC ^pnga2area Q. (CI) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_IR_IR_\\6_\\7 # GOES-East/West 13.3 um composites UNIWISC ^pnga2area Q. (CL) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_13.3_13.3_\\6_\\7","title":"UNIWISC GOES-East/West Northern Hemisphere Composites"},{"location":"edex/data-satellite/#20km-rectilinear-global-composites","text":"# Global WV composite UNIWISC ^pnga2area Q. (GW) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GLOBAL_\\5_WV_WVCOMP_\\6_\\7 # Global IR composites UNIWISC ^pnga2area Q. (GI) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GLOBAL_\\5_IR_IRCOMP_\\6_\\7","title":"20km Rectilinear Global Composites"},{"location":"edex/data-satellite/#30km-mollweide-global-composites","text":"# Mollweide Global Water Vapor UNIWISC ^pnga2area Q. (UY) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_MOLLWEIDE_30km_WV_MOLLWV_\\6_\\7 # Mollweide Global IR UNIWISC ^pnga2area Q. (UX) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_MOLLWEIDE_30km_IR_MOLLIR_\\6_\\7 # These work # GOES Visible (UV 4km VIS disabled) UNIWISC ^pnga2area Q. (EV|U9) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_VIS_\\4_\\6_\\7 # GOES Water Vapor UNIWISC ^pnga2area Q. (UW|UB) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_WV_\\4_\\6_\\7 # GOES Thermal Infrared UNIWISC ^pnga2area Q. (UI|U5) (.*) (.*)_IMG (.*)um (.*) (........) 
(....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_IR_\\4_\\6_\\7 # GOES other UNIWISC ^pnga2area Q. (UD|UE|U7|U8|) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_\\4_\\6_\\7","title":"30km Mollweide Global Composites"},{"location":"edex/data-satellite/#arctic-composite-imagery","text":"UNIWISC ^pnga2area Q. (U[LNGHO]) (.*) (.*) (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ARCTIC_4km_\\4_\\6_\\7","title":"Arctic Composite Imagery"},{"location":"edex/data-satellite/#antarctic-composite-imagery","text":"# Antarctic VIS Composite UNIWISC ^pnga2area Q. (UJ) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_VIS_\\3_\\4_\\6_\\7 # Antarctic PCOL Composite UNIWISC ^pnga2area Q. (UK) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_PCOL_\\3_\\4_\\6_\\7 # Antarctic WV Composite UNIWISC ^pnga2area Q. (UF) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_WV_\\3_\\4_\\6_\\7 # Antarctic Composite IR UNIWISC ^pnga2area Q. (U1) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_IR_\\3_\\4_\\6_\\7","title":"Antarctic Composite Imagery"},{"location":"edex/data-satellite/#goes-sounder-derived-imagery","text":"# CIMSS CAPE - McIDAS product code CE UNIWISC ^pnga2area Q0 CE .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_CAPE_\\4_\\5 # CIMSS Cloud Top Pressure - McIDAS product code CA UNIWISC ^pnga2area Q0 CA .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_CTP_\\4_\\5 # CIMSS Lifted Index - McIDAS product code CD UNIWISC ^pnga2area Q0 CD .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_LI_\\4_\\5 # CIMSS Ozone - McIDAS product code CF UNIWISC ^pnga2area Q0 CF .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_OZONE_\\4_\\5 # CIMSS Total Column Precipitable Water - McIDAS product code CB UNIWISC ^pnga2area Q0 CB .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_PW_\\4_\\5 # CIMSS Sea Surface Temperature - McIDAS product code CC UNIWISC ^pnga2area Q0 CC .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_SST_\\4_\\5 # CIMSS Northern Hemisphere Wildfire ABBA - McIDAS product code CG (inactive) UNIWISC ^pnga2area Q0 CG (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_FIRESNH_\\4_\\5 # CIMSS Southern Hemisphere Wildfire ABBA - McIDAS product code CH (inactive) UNIWISC ^pnga2area Q0 CH (.*) (.*) (.*) (.*) (........) (....) 
PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_FIRESSH_\\4_\\5","title":"GOES Sounder Derived Imagery"},{"location":"edex/derived-parameters/","text":"AWIPS will calculate derived parameters using XML file definitions which refer to Python scripts where the actual calculations take place. If and when there is an effort to verify calculated fields in Unidata Python packages, these should come in handy (along with the GEMPAK FORTRAN routines). For gridded data there are three directories to know which contain derived parm XML files: awips2 com.raytheon.uf.common.dataplugin.grid https://github.com/Unidata/awips2/tree/unidata_16.1.5/edexOsgi/com.raytheon.uf.common.dataplugin.grid/utility/common_static/base/derivedParameters/definitions awips2-core com.raytheon.uf.common.derivparam https://github.com/Unidata/awips2-core/tree/unidata_16.1.4/common/com.raytheon.uf.common.derivparam/utility/common_static/base/derivedParameters/definitions awips2-core com.raytheon.uf.common.derivparam.python https://github.com/Unidata/awips2-core/tree/unidata_16.1.4/common/com.raytheon.uf.common.derivparam.python/utility/common_static/base/derivedParameters Notice the first is from the \"awips2\" repo, while the others are from \"awips2-core\", so if a derived parm field is not showing up in directory listings or search results for one repo, be sure to search the other. Helicity, for example, from https://github.com/Unidata/awips2-core/blob/unidata_16.1.4/common/com.raytheon.uf.common.derivparam/utility/common_static/base/derivedParameters/definitions/Heli.xml The first three blocks act as constructors allowing for different levels to be passed the main method, which determines the calculation to be performed. Name=\"Heli\" here refers to the derivedParameter file Heli.py located at https://github.com/Unidata/awips2-core/blob/unidata_16.1.4/common/com.raytheon.uf.common.derivparam.python/utility/common_static/base/derivedParameters/functions/Heli.py So three fields will be passed to Heli.py for the calculation (uWStk, vWStk, RM5), but each is a derived paremeter as well, with its own xml (let's go deeper). uWStk https://github.com/Unidata/awips2/blob/unidata_16.1.5/edexOsgi/com.raytheon.uf.common.dataplugin.grid/utility/common_static/base/derivedParameters/definitions/uStk.xml vWStk https://github.com/Unidata/awips2/blob/unidata_16.1.5/edexOsgi/com.raytheon.uf.common.dataplugin.grid/utility/common_static/base/derivedParameters/definitions/vStk.xml RM5 https://github.com/Unidata/awips2/blob/unidata_16.1.5/edexOsgi/com.raytheon.uf.common.dataplugin.grid/utility/common_static/base/derivedParameters/definitions/RM5.xml Finally, the Python function Heli.py https://github.com/Unidata/awips2-core/blob/unidata_16.1.4/common/com.raytheon.uf.common.derivparam.python/utility/common_static/base/derivedParameters/functions/Heli.py def execute(uStk, vStk, RM5): umot,vmot = RM5 u1 = uStk[0] v1 = vStk[0] u2 = uStk[-1] v2 = vStk[-1] # First do our motion, lower bulk shear computation. 
hptr = (v2-v1)*umot+(u1-u2)*vmot for i in range(1, len(uStk)): u1 = uStk[i-1] v1 = vStk[i-1] u2 = uStk[i] v2 = vStk[i] hptr += u2*v1-u1*v2 return hptr From the three files above, we can see that the grid fields used to calculate helicity are uW, vW, USTM, and VSTM, with the last two used by yet another function Vector.py: https://github.com/Unidata/awips2-core/blob/unidata_16.1.4/common/com.raytheon.uf.common.derivparam.python/utility/common_static/base/derivedParameters/functions/Vector.py Method names like Or, Union, and Alias that don't map to Python functions are handled by Java in com.raytheon.uf.common.derivparam (see https://github.com/Unidata/awips2-core/tree/unidata_16.1.4/common/com.raytheon.uf.common.derivparam/src/com/raytheon/uf/common/derivparam/tree)","title":"Derived parameters"},{"location":"edex/distributed-computing/","text":"Distributed EDEX \uf0c1 AWIPS makes use of service-oriented architecture to request, process, and serve real-time meteorological data. While originally developed for use on internal NWS forecast office networks, where operational installations of AWIPS could consist of a dozen servers or more, the early Unidata releases were stripped of operations-specific configurations and plugins, and released as a standalone server. This worked, since (at the time) a single EDEX instance with an attached SSD could handle most of NOAAport. However, with GOES-R(16) coming online in 2017, and more gridded forecast models being created at finer temporal and spatial resolutions, there is now a need to distribute the data decoding across multiple machines to handle this firehose of data. Unidata's Current EDEX Server \uf0c1 Currently, our EDEX setup uses a Database/Request instance that also decodes and ingests a good portion of the data. It handles all data requests from CAVE users, as well as the majority of the decoding and ingesting for data feeds coming down on the LDM. Radar data has been specifically excluded from this decoding and ingest; it has its own Ingest/Decode Server, which is explained in more detail below. For our EDEX we have designated an instance of the ingest/decoding server dedicated to handling radar data. Our Radar-EDEX receives and decodes all radar data coming down from the LDM and then stores it back on our main Database/Request EDEX in the form of HDF5 data files and PostgreSQL metadata. Example Installation \uf0c1 This walkthrough will install different EDEX components on two machines in the XSEDE Jetstream Cloud; the first is used to store and serve data, while the second is used to ingest and decode data. Database/Request Server \uf0c1 For this example, this server will be referred to by the IP address 10.0.0.9 . 1. Install \uf0c1 wget https://downloads.unidata.ucar.edu/awips2/current/linux/awips_install.sh chmod 755 awips_install.sh sudo ./awips_install.sh --database 2. IPtables Config \uf0c1 It is required that ports 5432 and 5672 be open for the specific IP addresses of outside EDEX ingest servers. It is not recommended that you leave port 5432 open to all connections (since the default awips database password is known, and is not meant as a security measure). Further, it is recommended that you change the default postgres awips user password (which then requires a reconfiguration of every remote EDEX ingest server in order to connect to this database/request server).
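As a sketch of that password change (standard PostgreSQL; the password value here is a placeholder, and any EDEX-side configuration that stores the password, such as the files under /awips2/edex/conf/db/hibernateConfig/ mentioned later on this page, must be updated to match):
# On the Database/Request server, as the awips user
psql -U awips -d metadata -c \"ALTER USER awips WITH PASSWORD 'ReplaceWithAStrongPassword';\"
The iptables configuration that restricts ports 5432 and 5672 to the ingest server follows.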
vi /etc/sysconfig/iptables *filter :INPUT DROP [0:0] :FORWARD DROP [0:0] :OUTPUT ACCEPT [0:0] :EXTERNAL - [0:0] :EDEX - [0:0] -A INPUT -i lo -j ACCEPT -A INPUT -p icmp --icmp-type any -j ACCEPT -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 22 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9581 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9582 -j ACCEPT -A INPUT -s 10.0.0.7 -j EDEX -A INPUT -j EXTERNAL -A EXTERNAL -j REJECT -A EDEX -m state --state NEW -p tcp --dport 5432 -j ACCEPT -A EDEX -m state --state NEW -p tcp --dport 5672 -j ACCEPT -A EDEX -j REJECT COMMIT Note the line -A INPUT -s 10.0.0.7 -j EDEX as well as the following -A EDEX ... rules for ports 5432 (PostgreSQL) and 5672 (PyPIES/HDF5). The two ports left open to all connections (9581,9582) in addition to default port 22 are for outside CAVE client connections 3. Database Config \uf0c1 In the file /awips2/database/data/pg_hba.conf you define remote connections for all postgres tables with as /32 , after the block of IPv4 local connections and generic for hostnossl: vi /awips2/database/data/pg_hba.conf # \"local\" is for Unix domain socket connections only local all all trust hostssl all all 162.0.0.0/8 cert clientcert=1 hostssl all all 127.0.0.1/32 cert clientcert=1 hostssl all all 10.0.0.7/32 cert clientcert=1 hostnossl postgres all 10.0.0.0/24 md5 hostnossl fxatext all 10.0.0.0/24 md5 hostnossl metadata all 10.0.0.0/24 md5 # IPv6 local connections: hostssl all all ::1/128 cert clientcert=1 hostnossl all all ::1/128 md5 4. Start EDEX \uf0c1 edex start database This will start PostgreSQL, httpd-pypies, Qpid, and the EDEX Request JVM (and will not start the LDM or the EDEX Ingest and IngestGrib JVMs) 5. Monitor Services \uf0c1 The command edex will show which services are running, and for a Database/Request server, will not include the LDM, EDEXingest, or EDEXgrib: edex [edex status] postgres :: running :: pid 571 pypies :: running :: pid 639 qpid :: running :: pid 674 EDEXingest :: not running EDEXgrib :: not running EDEXrequest :: running :: pid 987 1029 23792 Since this Database/Request server is not running the main edexIngest JVM, we won't see anything from edex log , instead watch the Request Server with the command edex log request Confirm that EDEX Request connects to PostgreSQL! With the above edex log request , ensure that the log progresses past this point : Spring-enabled Plugins: ----------------------- acars-common, acars-common-dataaccess, acarssounding-common, activetable-common, activetable-request, airep-common, airep-common-dataaccess, airmet-common, atcf-common, atcf-request, auth-request, awipstools-request, aww-common... JAXB context for PersistencePathKeySet inited in: 5ms INFO 20:21:09,134 5584 [EDEXMain] Reflections: Reflections took 436 ms to scan 258 urls, producing 31 keys and 3637 values Found 499 db classes in 720 ms If the log stops at the Found db classes... line, that means EDEX is not connecting to PostgreSQL - double-check DB_ADDR in /awips2/edex/bin/setup.env Ancillary EDEX Server (Ingest/Decode EDEX Server) \uf0c1 For this example, this server will be referred to by the IP address 10.0.0.7 . The Main EDEX server will be referred to by the IP address 10.0.0.9 . 1. Install \uf0c1 wget https://downloads.unidata.ucar.edu/awips2/current/linux/awips_install.sh chmod 755 awips_install.sh sudo ./awips_install.sh --ingest 2. 
EDEX Config \uf0c1 vi /awips2/edex/bin/setup.env Here you should redefine DB_ADDR and PYPIES_SERVER to point to the Main or Database/Request server (10.0.0.9) and the EXT_ADDR to point to the current Ingest server (10.0.0.7) export EXT_ADDR=10.0.0.7 # postgres connection export DB_ADDR=10.0.0.9 export DB_PORT=5432 # pypies hdf5 connection export PYPIES_SERVER=http://10.0.0.9:9582 # qpid connection export BROKER_ADDR=${EXT_ADDR} Notice that EXT_ADDR and BROKER_ADDR (qpid) should remain defined as the localhost IP address (10.0.0.7) 3. Modify the edexServiceList \uf0c1 Most likely if you are running a distributed EDEX setup, you are only processing a subset of data. You can change your edexServiceList to only run the processes you need. You will need to update the /etc/init.d/edexServiceList file. For example replace the services with the associated right column based on the data you're processing: export SERVICES=('') Data Processing: edexServiceList radar ingestRadar satellite ingestGoesR model ingestGrids, ingestGrib 4. Configure your LDM \uf0c1 You'll want to modify your pqact.conf file to store only the data you want processed. There are example files in /awips2/ldm/etc that you can copy over to the main pqact.conf file. For example if you are wanting to process goesr data only, you can do the following steps: cd /awips2/ldm/etc mv pqact.conf pqact.conf.orig cp pqact.goesr pqact.conf You will also want to edit the pqact.conf file on your Main EDEX and comment out any entries you're processing on this EDEX server. 5. Start EDEX \uf0c1 edex start This will start LDM, Qpid and the specified EDEX Ingest JVMs (and not start PostgreSQL, httpd-pypies, or the EDEX Request JVM) 4. Monitor Services \uf0c1 Watch the edex JVM log with the command edex log Confirm that EDEX connects to PostgreSQL! With the above edex log , ensure that the log progresses past this point : Spring-enabled Plugins: ----------------------- acars-common, acars-common-dataaccess, acarssounding-common, activetable-common, activetable-ingest, airep-common, airep-common-dataaccess, airmet-common, atcf-common, atcf-ingest, aww-common... JAXB context for PersistencePathKeySet inited in: 5ms INFO 20:21:09,134 5584 [EDEXMain] Reflections: Reflections took 436 ms to scan 258 urls, producing 31 keys and 3637 values Found 499 db classes in 720 ms If the log stops at the Found db classes... line, that means EDEX is not connecting to the remote PostgreSQL instance - double-check DB_ADDR in /awips2/edex/bin/setup.env You can manually check remote PostgreSQL connectivity on any EDEX Ingest server from the command line: su - awips psql -U awips -h -p 5432 metadata Where the default passwd is awips and is defined in files in /awips2/edex/conf/db/hibernateConfig/ Additional Notes \uf0c1 Be mindful of what IP address and hostnames are used in /awips2/edex/bin/setup.env and /awips2/database/data/pg_hba.conf , and that they are resolvable from the command line. Consult or edit /etc/hosts as needed. You can install multiple awips2-ingest servers, each decoding a different dataset or feed, all pointing to the same Database/Request server ( DB_ADDR and PYPIES_SERVER in /awips2/edex/bin/setup.env ): Every EDEX Ingest IP address must be allowed in both iptables and pg_hba.conf as shown above . Data processed on","title":"Distributed EDEX"},{"location":"edex/distributed-computing/#distributed-edex","text":"AWIPS makes use of service-oriented architecture to request, process, and serve real-time meteorological data. 
While originally developed for use on internal NWS forecast office networks, where operational installations of AWIPS could consist of a dozen servers or more, the early Unidata releases were stripped of operations-specific configurations and plugins, and released as a standalone server. This worked, since (at the time) a single EDEX instance with an attached SSD could handle most of NOAAport. However, with GOES-R(16) coming online in 2017, and more gridded forecast models being created at finer temporal and spatial resolutions, there is now a need to distribute the data decoding across multiple machines to handle this firehose of data.","title":"Distributed EDEX"},{"location":"edex/distributed-computing/#unidatas-current-edex-server","text":"Currently, our EDEX setup uses a Database/Request instance that also decodes and ingests a good portion of the data. It handles all data requests from CAVE users, as well as the majority of the decoding and ingesting for data feeds coming down on the LDM. Radar data has been specifically excluded from this decoding and ingest; it has its own Ingest/Decode Server, which is explained in more detail below. For our EDEX we have designated an instance of the ingest/decoding server dedicated to handling radar data. Our Radar-EDEX receives and decodes all radar data coming down from the LDM and then stores it back on our main Database/Request EDEX in the form of HDF5 data files and PostgreSQL metadata.","title":"Unidata's Current EDEX Server"},{"location":"edex/distributed-computing/#example-installation","text":"This walkthrough will install different EDEX components on two machines in the XSEDE Jetstream Cloud; the first is used to store and serve data, while the second is used to ingest and decode data.","title":"Example Installation"},{"location":"edex/distributed-computing/#databaserequest-server","text":"For this example, this server will be referred to by the IP address 10.0.0.9 .","title":"Database/Request Server"},{"location":"edex/distributed-computing/#1-install","text":"wget https://downloads.unidata.ucar.edu/awips2/current/linux/awips_install.sh chmod 755 awips_install.sh sudo ./awips_install.sh --database","title":"1. Install"},{"location":"edex/distributed-computing/#2-iptables-config","text":"It is required that ports 5432 and 5672 be open for the specific IP addresses of outside EDEX ingest servers. It is not recommended that you leave port 5432 open to all connections (since the default awips database password is known, and is not meant as a security measure). Further, it is recommended that you change the default postgres awips user password (which then requires a reconfiguration of every remote EDEX ingest server in order to connect to this database/request server). vi /etc/sysconfig/iptables *filter :INPUT DROP [0:0] :FORWARD DROP [0:0] :OUTPUT ACCEPT [0:0] :EXTERNAL - [0:0] :EDEX - [0:0] -A INPUT -i lo -j ACCEPT -A INPUT -p icmp --icmp-type any -j ACCEPT -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 22 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9581 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9582 -j ACCEPT -A INPUT -s 10.0.0.7 -j EDEX -A INPUT -j EXTERNAL -A EXTERNAL -j REJECT -A EDEX -m state --state NEW -p tcp --dport 5432 -j ACCEPT -A EDEX -m state --state NEW -p tcp --dport 5672 -j ACCEPT -A EDEX -j REJECT COMMIT Note the line -A INPUT -s 10.0.0.7 -j EDEX as well as the following -A EDEX ...
rules for ports 5432 (PostgreSQL) and 5672 (PyPIES/HDF5). The two ports left open to all connections (9581,9582) in addition to default port 22 are for outside CAVE client connections","title":"2. IPtables Config"},{"location":"edex/distributed-computing/#3-database-config","text":"In the file /awips2/database/data/pg_hba.conf you define remote connections for all postgres tables with as /32 , after the block of IPv4 local connections and generic for hostnossl: vi /awips2/database/data/pg_hba.conf # \"local\" is for Unix domain socket connections only local all all trust hostssl all all 162.0.0.0/8 cert clientcert=1 hostssl all all 127.0.0.1/32 cert clientcert=1 hostssl all all 10.0.0.7/32 cert clientcert=1 hostnossl postgres all 10.0.0.0/24 md5 hostnossl fxatext all 10.0.0.0/24 md5 hostnossl metadata all 10.0.0.0/24 md5 # IPv6 local connections: hostssl all all ::1/128 cert clientcert=1 hostnossl all all ::1/128 md5","title":"3. Database Config"},{"location":"edex/distributed-computing/#4-start-edex","text":"edex start database This will start PostgreSQL, httpd-pypies, Qpid, and the EDEX Request JVM (and will not start the LDM or the EDEX Ingest and IngestGrib JVMs)","title":"4. Start EDEX"},{"location":"edex/distributed-computing/#5-monitor-services","text":"The command edex will show which services are running, and for a Database/Request server, will not include the LDM, EDEXingest, or EDEXgrib: edex [edex status] postgres :: running :: pid 571 pypies :: running :: pid 639 qpid :: running :: pid 674 EDEXingest :: not running EDEXgrib :: not running EDEXrequest :: running :: pid 987 1029 23792 Since this Database/Request server is not running the main edexIngest JVM, we won't see anything from edex log , instead watch the Request Server with the command edex log request Confirm that EDEX Request connects to PostgreSQL! With the above edex log request , ensure that the log progresses past this point : Spring-enabled Plugins: ----------------------- acars-common, acars-common-dataaccess, acarssounding-common, activetable-common, activetable-request, airep-common, airep-common-dataaccess, airmet-common, atcf-common, atcf-request, auth-request, awipstools-request, aww-common... JAXB context for PersistencePathKeySet inited in: 5ms INFO 20:21:09,134 5584 [EDEXMain] Reflections: Reflections took 436 ms to scan 258 urls, producing 31 keys and 3637 values Found 499 db classes in 720 ms If the log stops at the Found db classes... line, that means EDEX is not connecting to PostgreSQL - double-check DB_ADDR in /awips2/edex/bin/setup.env","title":"5. Monitor Services"},{"location":"edex/distributed-computing/#ancillary-edex-server-ingestdecode-edex-server","text":"For this example, this server will be referred to by the IP address 10.0.0.7 . The Main EDEX server will be referred to by the IP address 10.0.0.9 .","title":"Ancillary EDEX Server (Ingest/Decode EDEX Server)"},{"location":"edex/distributed-computing/#1-install_1","text":"wget https://downloads.unidata.ucar.edu/awips2/current/linux/awips_install.sh chmod 755 awips_install.sh sudo ./awips_install.sh --ingest","title":"1. 
Install"},{"location":"edex/distributed-computing/#2-edex-config","text":"vi /awips2/edex/bin/setup.env Here you should redefine DB_ADDR and PYPIES_SERVER to point to the Main or Database/Request server (10.0.0.9) and the EXT_ADDR to point to the current Ingest server (10.0.0.7) export EXT_ADDR=10.0.0.7 # postgres connection export DB_ADDR=10.0.0.9 export DB_PORT=5432 # pypies hdf5 connection export PYPIES_SERVER=http://10.0.0.9:9582 # qpid connection export BROKER_ADDR=${EXT_ADDR} Notice that EXT_ADDR and BROKER_ADDR (qpid) should remain defined as the localhost IP address (10.0.0.7)","title":"2. EDEX Config"},{"location":"edex/distributed-computing/#3-modify-the-edexservicelist","text":"Most likely if you are running a distributed EDEX setup, you are only processing a subset of data. You can change your edexServiceList to only run the processes you need. You will need to update the /etc/init.d/edexServiceList file. For example replace the services with the associated right column based on the data you're processing: export SERVICES=('') Data Processing: edexServiceList radar ingestRadar satellite ingestGoesR model ingestGrids, ingestGrib","title":"3. Modify the edexServiceList"},{"location":"edex/distributed-computing/#4-configure-your-ldm","text":"You'll want to modify your pqact.conf file to store only the data you want processed. There are example files in /awips2/ldm/etc that you can copy over to the main pqact.conf file. For example if you are wanting to process goesr data only, you can do the following steps: cd /awips2/ldm/etc mv pqact.conf pqact.conf.orig cp pqact.goesr pqact.conf You will also want to edit the pqact.conf file on your Main EDEX and comment out any entries you're processing on this EDEX server.","title":"4. Configure your LDM"},{"location":"edex/distributed-computing/#5-start-edex","text":"edex start This will start LDM, Qpid and the specified EDEX Ingest JVMs (and not start PostgreSQL, httpd-pypies, or the EDEX Request JVM)","title":"5. Start EDEX"},{"location":"edex/distributed-computing/#4-monitor-services","text":"Watch the edex JVM log with the command edex log Confirm that EDEX connects to PostgreSQL! With the above edex log , ensure that the log progresses past this point : Spring-enabled Plugins: ----------------------- acars-common, acars-common-dataaccess, acarssounding-common, activetable-common, activetable-ingest, airep-common, airep-common-dataaccess, airmet-common, atcf-common, atcf-ingest, aww-common... JAXB context for PersistencePathKeySet inited in: 5ms INFO 20:21:09,134 5584 [EDEXMain] Reflections: Reflections took 436 ms to scan 258 urls, producing 31 keys and 3637 values Found 499 db classes in 720 ms If the log stops at the Found db classes... line, that means EDEX is not connecting to the remote PostgreSQL instance - double-check DB_ADDR in /awips2/edex/bin/setup.env You can manually check remote PostgreSQL connectivity on any EDEX Ingest server from the command line: su - awips psql -U awips -h -p 5432 metadata Where the default passwd is awips and is defined in files in /awips2/edex/conf/db/hibernateConfig/","title":"4. Monitor Services"},{"location":"edex/distributed-computing/#additional-notes","text":"Be mindful of what IP address and hostnames are used in /awips2/edex/bin/setup.env and /awips2/database/data/pg_hba.conf , and that they are resolvable from the command line. Consult or edit /etc/hosts as needed. 
You can install multiple awips2-ingest servers, each decoding a different dataset or feed, all pointing to the same Database/Request server ( DB_ADDR and PYPIES_SERVER in /awips2/edex/bin/setup.env ): Every EDEX Ingest IP address must be allowed in both iptables and pg_hba.conf as shown above . Data processed on","title":"Additional Notes"},{"location":"edex/edex-ingest-docker-container/","text":"Docker EDEX \uf0c1 Project home: https://github.com/Unidata/edex-docker EDEX can be run inside a docker container, which allows you to process data into an AWIPS system without requiring accessing and altering the machine's native CentOS installation and configuration. The EDEX Docker Image is built on CentOS 7 and contains the latest Unidata AWIPS release (18.1.1). This container is an ingest-only install, meaning there is no database or request server . This example requires a Database/Request server be configured for you to access remotely. See the Distributed EDEX document for more. Download and Install Docker \uf0c1 Download and install Docker and Docker Compose: Docker for CentOS 7 Linux Docker for Mac Docker for Windows docker-compose (it should be bundled with Docker by default on Mac and Windows) Run the EDEX Ingest Container \uf0c1 Clone the source repository: git clone https://github.com/Unidata/edex-docker.git cd edex-docker Run the container with docker-compose: docker-compose up -d edex-ingest Confirm the container is running: docker ps -a Enter the container: docker exec -it edex-ingest bash Stop the container: docker-compose stop Delete the container (keep the image): docker-compose rm -f Run commands inside the container, such as: docker exec edex-ingest edex which should return something like: [edex status] qpid :: running :: pid 22474 EDEXingest :: running :: pid 21860 31513 EDEXgrib :: not running ldmadmin :: running :: pid 22483 edex (status|start|stop|setup|log|purge|qpid|users) To update to the latest version and restart: docker pull unidata/edex-ingest:latest docker-compose stop docker-compose up -d edex-ingest Configuration and Customization \uf0c1 The file docker-compose.yml defines files to mount to the container and which ports to open: edex-ingest: image: unidata/edex-ingest:latest container_name: edex-ingest volumes: - ./etc/ldmd.conf:/awips2/ldm/etc/ldmd.conf - ./etc/pqact.conf:/awips2/ldm/etc/pqact.conf - ./bin/setup.env:/awips2/edex/bin/setup.env - ./bin/runedex.sh:/awips2/edex/bin/runedex.sh ports: - \"388:388\" ulimits: nofile: soft: 1024 hard: 1024 Mounted Files \uf0c1 etc/ldmd.conf \uf0c1 Defines which data feeds to receive. By default there is only one active request line ( REQUEST IDS|DDPLUS \".*\" idd.unidata.ucar.edu ) to not overwhelm small EDEX containers ingesting large volumes of radar and gridded data files. Any updates to the file etc/ldmd.conf will be read the next time you restart the container. etc/pqact.conf \uf0c1 Defines how products are processed and where they are written to on the filesystem. This is the full set of pattern actions used in Unidata AWIPS, and generally you do not need to edit this file. Instead control which data feeds are requested in ldmd.conf (above). bin/setup.env \uf0c1 Defines the remote EDEX Database/Request server: ### EDEX localization related variables ### export AW_SITE_IDENTIFIER=OAX export EXT_ADDR=js-157-198.jetstream-cloud.org Note : EXT_ADDR must be set to an allowed EDEX Database/Request Server. 
In this example we are using a JetStream Cloud instance, which controls our edex-ingest access with IPtables, SSL certificates, and PostgreSQL pg_hba.conf rules. This server will not allow outside connections, you must change this to point to an appropriate server. bin/runedex.sh \uf0c1 The default script run when the container is started, acts as a sort-of service manager for EDEX and the LDM (see ENTRYPOINT [\"/awips2/edex/bin/runedex.sh\"] in Dockerfile.edex ), essentially: /awips2/qpid/bin/qpid-wrapper & /awips2/edex/bin/start.sh -noConsole ingest & ldmadmin mkqueue ldmadmin start","title":"Docker EDEX"},{"location":"edex/edex-ingest-docker-container/#docker-edex","text":"Project home: https://github.com/Unidata/edex-docker EDEX can be run inside a docker container, which allows you to process data into an AWIPS system without requiring accessing and altering the machine's native CentOS installation and configuration. The EDEX Docker Image is built on CentOS 7 and contains the latest Unidata AWIPS release (18.1.1). This container is an ingest-only install, meaning there is no database or request server . This example requires a Database/Request server be configured for you to access remotely. See the Distributed EDEX document for more.","title":"Docker EDEX"},{"location":"edex/edex-ingest-docker-container/#download-and-install-docker","text":"Download and install Docker and Docker Compose: Docker for CentOS 7 Linux Docker for Mac Docker for Windows docker-compose (it should be bundled with Docker by default on Mac and Windows)","title":"Download and Install Docker"},{"location":"edex/edex-ingest-docker-container/#run-the-edex-ingest-container","text":"Clone the source repository: git clone https://github.com/Unidata/edex-docker.git cd edex-docker Run the container with docker-compose: docker-compose up -d edex-ingest Confirm the container is running: docker ps -a Enter the container: docker exec -it edex-ingest bash Stop the container: docker-compose stop Delete the container (keep the image): docker-compose rm -f Run commands inside the container, such as: docker exec edex-ingest edex which should return something like: [edex status] qpid :: running :: pid 22474 EDEXingest :: running :: pid 21860 31513 EDEXgrib :: not running ldmadmin :: running :: pid 22483 edex (status|start|stop|setup|log|purge|qpid|users) To update to the latest version and restart: docker pull unidata/edex-ingest:latest docker-compose stop docker-compose up -d edex-ingest","title":"Run the EDEX Ingest Container"},{"location":"edex/edex-ingest-docker-container/#configuration-and-customization","text":"The file docker-compose.yml defines files to mount to the container and which ports to open: edex-ingest: image: unidata/edex-ingest:latest container_name: edex-ingest volumes: - ./etc/ldmd.conf:/awips2/ldm/etc/ldmd.conf - ./etc/pqact.conf:/awips2/ldm/etc/pqact.conf - ./bin/setup.env:/awips2/edex/bin/setup.env - ./bin/runedex.sh:/awips2/edex/bin/runedex.sh ports: - \"388:388\" ulimits: nofile: soft: 1024 hard: 1024","title":"Configuration and Customization"},{"location":"edex/edex-ingest-docker-container/#mounted-files","text":"","title":"Mounted Files"},{"location":"edex/edex-ingest-docker-container/#etcldmdconf","text":"Defines which data feeds to receive. By default there is only one active request line ( REQUEST IDS|DDPLUS \".*\" idd.unidata.ucar.edu ) to not overwhelm small EDEX containers ingesting large volumes of radar and gridded data files. 
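For example, to have the container also receive Level 3 radar data, you could add a request line of the same form shown on the LDM Feeds page to etc/ldmd.conf and then restart the container:
# Additional feed request (same syntax as the default IDS|DDPLUS line)
REQUEST NEXRAD3 \".*\" idd.unidata.ucar.edu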
Any updates to the file etc/ldmd.conf will be read the next time you restart the container.","title":"etc/ldmd.conf"},{"location":"edex/edex-ingest-docker-container/#etcpqactconf","text":"Defines how products are processed and where they are written to on the filesystem. This is the full set of pattern actions used in Unidata AWIPS, and generally you do not need to edit this file. Instead control which data feeds are requested in ldmd.conf (above).","title":"etc/pqact.conf"},{"location":"edex/edex-ingest-docker-container/#binsetupenv","text":"Defines the remote EDEX Database/Request server: ### EDEX localization related variables ### export AW_SITE_IDENTIFIER=OAX export EXT_ADDR=js-157-198.jetstream-cloud.org Note : EXT_ADDR must be set to an allowed EDEX Database/Request Server. In this example we are using a JetStream Cloud instance, which controls our edex-ingest access with IPtables, SSL certificates, and PostgreSQL pg_hba.conf rules. This server will not allow outside connections; you must change this to point to an appropriate server.","title":"bin/setup.env"},{"location":"edex/edex-ingest-docker-container/#binrunedexsh","text":"The default script run when the container is started acts as a sort-of service manager for EDEX and the LDM (see ENTRYPOINT [\"/awips2/edex/bin/runedex.sh\"] in Dockerfile.edex ), essentially: /awips2/qpid/bin/qpid-wrapper & /awips2/edex/bin/start.sh -noConsole ingest & ldmadmin mkqueue ldmadmin start","title":"bin/runedex.sh"},{"location":"edex/edex-users/","text":"Monitor Users \uf0c1 To see a list of clients connecting to your EDEX server, use the edex users [YYYYMMDD] command, where [YYYYMMDD] is the optional date string. edex users -- EDEX Users 20160826 -- user@101.253.20.225 user@192.168.1.67 awips@0.0.0.0 awips@sdsmt.edu ... Logging Daily EDEX Users \uf0c1 To get a running log of who has accessed EDEX, you can create a short script. The example below is a script that runs once daily at 20 minutes after 00 UTC, appending each day's edex users list to a logfile /home/awips/edex-users.log : vi ~/edexUsers.sh #!/bin/bash /awips2/edex/bin/edex users >> /home/awips/edex-users.log crontab -e 0 20 * * * /home/awips/edexUsers.sh 1>> /dev/null 2>&1","title":"Monitor Users"},{"location":"edex/edex-users/#monitor-users","text":"To see a list of clients connecting to your EDEX server, use the edex users [YYYYMMDD] command, where [YYYYMMDD] is the optional date string. edex users -- EDEX Users 20160826 -- user@101.253.20.225 user@192.168.1.67 awips@0.0.0.0 awips@sdsmt.edu ...","title":"Monitor Users"},{"location":"edex/edex-users/#logging-daily-edex-users","text":"To get a running log of who has accessed EDEX, you can create a short script. The example below is a script that runs once daily at 20 minutes after 00 UTC, appending each day's edex users list to a logfile /home/awips/edex-users.log : vi ~/edexUsers.sh #!/bin/bash /awips2/edex/bin/edex users >> /home/awips/edex-users.log crontab -e 0 20 * * * /home/awips/edexUsers.sh 1>> /dev/null 2>&1","title":"Logging Daily EDEX Users"},{"location":"edex/ldm-gempak/","text":"LDM for AWIPS and GEMPAK \uf0c1 It is possible to have two LDM installs (since the AWIPS LDM installs to /awips2/ldm and is owned and run by user awips:awips ), but running two LDM clients doubles your bandwidth usage and is unnecessary. This document explains how the LDM keeps its EDEX processing separate from other processing (GEMPAK decoders, TDS, etc.).
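Once the two EXEC lines described in the next section are in place, a quick sanity check (not a required step) is a standard process listing to confirm that one pqact process is running per EXEC line:
# Expect one pqact per EXEC line in ldmd.conf
ps aux | grep pqact | grep -v grep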
/awips2/ldm/etc/ldmd.conf \uf0c1 The default AWIPS LDM config file executes a single pqact process with the -e flag (for EDEX). The edexBridge server name is defined (typically the local machine name): EXEC \"pqact -e /awips2/ldm/etc/pqact.conf\" EXEC \"edexBridge -s $HOSTNAME\" A separate EXEC line should be added for GEMPAK decoders, without the -e flag: EXEC \"pqact -f IDS-DDPLUS /awips2/ldm/etc/pqact.gempak_decoders\" yum install apr-devel apr-util-devel \"libdb-4.7.so()(64bit)\"","title":"LDM for AWIPS and GEMPAK"},{"location":"edex/ldm-gempak/#ldm-for-awips-and-gempak","text":"It is possible to have two LDM installs (since the AWIPS LDM installs to /awips2/ldm and is owned and run by user awips:awips ), but running two LDM clients doubles your bandwidth usage and is unnecessary. This document explains how the LDM keeps its EDEX processing separate from other processing (GEMPAK decoders, TDS, etc.).","title":"LDM for AWIPS and GEMPAK"},{"location":"edex/ldm-gempak/#awips2ldmetcldmdconf","text":"The default AWIPS LDM config file executes a single pqact process with the -e flag (for EDEX). The edexBridge server name is defined (typically the local machine name): EXEC \"pqact -e /awips2/ldm/etc/pqact.conf\" EXEC \"edexBridge -s $HOSTNAME\" A separate EXEC line should be added for GEMPAK decoders, without the -e flag: EXEC \"pqact -f IDS-DDPLUS /awips2/ldm/etc/pqact.gempak_decoders\" yum install apr-devel apr-util-devel \"libdb-4.7.so()(64bit)\"","title":"/awips2/ldm/etc/ldmd.conf"},{"location":"edex/ldm/","text":"LDM Feeds \uf0c1 Default LDM Feeds for EDEX \uf0c1 Data feeds are defined by the ldmd.conf file in /awips2/ldm/etc/ldmd.conf . The default feeds that come \"turned on\" with our EDEX are the following: REQUEST FNEXRAD \".*\" idd.unidata.ucar.edu # MRMS - Unidata feed via NCEP REQUEST NEXRAD3 \".*\" idd.unidata.ucar.edu # Radar Level3 REQUEST HDS \"^SDUS6.*\" idd.unidata.ucar.edu # Radar Level3 - specific files REQUEST WMO \".*\" idd.unidata.ucar.edu # WMO Feedtype includes HDS|IDS|DDPLUS REQUEST UNIWISC|NIMAGE \".*\" idd.unidata.ucar.edu # AREA/GINI and GOES Products REQUEST DIFAX \"GLM\" idd.unidata.ucar.edu # GOES GLM Gridded Product (Texas Tech-Eric Bruning) REQUEST NOTHER \"^TI[A-W]... KNES\" idd.unidata.ucar.edu # VIIRS and GOES CMI via SBN REQUEST NOTHER \"^IXT[WXY]01\" idd.unidata.ucar.edu #Special SBN GOES Derived products-different WMO (COD, CPS, CTP) REQUEST NGRID \".*\" idd.unidata.ucar.edu REQUEST CONDUIT \"nam\" idd.unidata.ucar.edu # NAM12 REQUEST CONDUIT \"pgrb2\" idd.unidata.ucar.edu # GFS Optional LDM Feeds \uf0c1 Some additional feeds are included but commented out using '#'. To activate the feed, simply remove the #, save the file, and restart the LDM .
FNMOC and CMC models \uf0c1 REQUEST FNMOC \".*\" idd.unidata.ucar.edu REQUEST CMC \".*\" idd.unidata.ucar.edu Lightning (restricted to educational use with rebroadcasting restricted) \uf0c1 REQUEST LIGHTNING \".*\" striker2.atmos.albany.edu REQUEST LIGHTNING \".*\" idd.unidata.ucar.edu FSL/GSD Experimental HRRR (Sub-hourly) \uf0c1 REQUEST FSL2 \"^GRIB2.FSL.HRRR\" hrrr.unidata.ucar.edu Restart the LDM \uf0c1 Use the following commands to restart the LDM: sudo service edex_ldm restart ldmadmin restart Monitor Incoming Data Feeds \uf0c1 To watch incoming data in real-time: notifyme -vl - To watch for a specific product and feed and time (360 sec = 6 min): notifyme -vl - -h localhost -f NEXRAD3 -p DHR -o 360 To watch the same on a remote queue: notifyme -vl - -h idd.unidata.ucar.edu -f NEXRAD3 -p DHR -o 360 LDM Logging \uf0c1 To open a real-time readout of LDM logging you can run use the edex command. To exit, press CTRL+C . edex log ldm [edex] EDEX Log Viewer :: Viewing /awips2/ldm/logs/ldmd.log. Press CTRL+C to exit Aug 26 15:05:10 edextest pqact[5811] NOTE: Filed in \"/awips2/data_store/grid/HRRR/HRRR_CONUS_2p5km_201608262000_F006_MXUPHL01-21387192.grib2\": 406227 20160826210510.477 NGRID 21387192 YZCG86 KWBY 262000 !grib2/ncep/HRRR/#255/201608262000F006/MXUPHL01/5000-2000 m HGHT Aug 26 15:05:11 edextest edexBridge[5812] NOTE: Sent 2 messages (0 at the end of the queue, 2 normally). Aug 26 15:05:11 edextest pqact[5811] NOTE: Filed in \"/awips2/data_store/grid/HRRR/HRRR_CONUS_2p5km_201608262000_F006_CICEP-21387200.grib2\": 369464 20160826210511.484 NGRID 21387200 YMCG98 KWBY 262000 !grib2/ncep/HRRR/#255/201608262000F006/CICEP/0 - NONE Aug 26 15:05:12 edextest edexBridge[5812] NOTE: Sent 9 messages (0 at the end of the queue, 9 normally). Aug 26 15:05:12 edextest pqact[5811] NOTE: Filed in \"/awips2/data_store/grid/HRRR/HRRR_CONUS_2p5km_201608262000_F006_LTNG-21387205.grib2\": 482800 20160826210512.254 NGRID 21387205 YZCG98 KWBY 262000 !grib2/ncep/HRRR/#255/201608262000F006/LTNG/0 - EATM Aug 26 15:05:13 edextest edexBridge[5812] NOTE: Sent 1 messages (0 at the end of the queue, 1 normally).","title":"LDM Feeds"},{"location":"edex/ldm/#ldm-feeds","text":"","title":"LDM Feeds"},{"location":"edex/ldm/#default-ldm-feeds-for-edex","text":"Data feeds are defined by the ldmd.conf file in /awips2/ldm/etc/ldmd.conf . The default feeds that come \"turned on\" with our EDEX are the following: REQUEST FNEXRAD \".*\" idd.unidata.ucar.edu # MRMS - Unidata feed via NCEP REQUEST NEXRAD3 \".*\" idd.unidata.ucar.edu # Radar Level3 REQUEST HDS \"^SDUS6.*\" idd.unidata.ucar.edu # Radar Level3 - specific files REQUEST WMO \".*\" idd.unidata.ucar.edu # WMO Feedtype includes HDS|IDS|DDPLUS REQUEST UNIWISC|NIMAGE \".*\" idd.unidata.ucar.edu # AREA/GINI and GOES Products REQUEST DIFAX \"GLM\" idd.unidata.ucar.edu # GOES GLM Gridded Product (Texas Tech-Eric Bruning) REQUEST NOTHER \"^TI[A-W]... KNES\" idd.unidata.ucar.edu # VIIRS and GOES CMI via SBN REQUEST NOTHER \"^IXT[WXY]01\" idd.unidata.ucar.edu #Special SBN GOES Derived products-different WMO (COD, CPS, CTP) REQUEST NGRID \".*\" idd.unidata.ucar.edu REQUEST CONDUIT \"nam\" idd.unidata.ucar.edu # NAM12 REQUEST CONDUIT \"pgrb2\" idd.unidata.ucar.edu # GFS","title":"Default LDM Feeds for EDEX"},{"location":"edex/ldm/#optional-ldm-feeds","text":"Some additional feeds are included but commented out using '#'. 
To activate the feed, simply remove the #, save the file, and restart the LDM .","title":"Optional LDM Feeds"},{"location":"edex/ldm/#fnmoc-and-cmc-models","text":"REQUEST FNMOC \".*\" idd.unidata.ucar.edu REQUEST CMC \".*\" idd.unidata.ucar.edu","title":"FNMOC and CMC models"},{"location":"edex/ldm/#lightning-restricted-to-educational-use-with-rebroadcasting-restricted","text":"REQUEST LIGHTNING \".*\" striker2.atmos.albany.edu REQUEST LIGHTNING \".*\" idd.unidata.ucar.edu","title":"Lightning (restricted to educational use with rebroadcasting restricted)"},{"location":"edex/ldm/#fslgsd-experimental-hrrr-sub-hourly","text":"REQUEST FSL2 \"^GRIB2.FSL.HRRR\" hrrr.unidata.ucar.edu","title":"FSL/GSD Experimental HRRR (Sub-hourly)"},{"location":"edex/ldm/#restart-the-ldm","text":"Use the following commands to restart the LDM: sudo service edex_ldm restart ldmadmin restart","title":"Restart the LDM"},{"location":"edex/ldm/#monitor-incoming-data-feeds","text":"To watch incoming data in real-time: notifyme -vl - To watch for a specific product and feed and time (360 sec = 6 min): notifyme -vl - -h localhost -f NEXRAD3 -p DHR -o 360 To watch the same on a remote queue: notifyme -vl - -h idd.unidata.ucar.edu -f NEXRAD3 -p DHR -o 360","title":"Monitor Incoming Data Feeds"},{"location":"edex/ldm/#ldm-logging","text":"To open a real-time readout of LDM logging you can run use the edex command. To exit, press CTRL+C . edex log ldm [edex] EDEX Log Viewer :: Viewing /awips2/ldm/logs/ldmd.log. Press CTRL+C to exit Aug 26 15:05:10 edextest pqact[5811] NOTE: Filed in \"/awips2/data_store/grid/HRRR/HRRR_CONUS_2p5km_201608262000_F006_MXUPHL01-21387192.grib2\": 406227 20160826210510.477 NGRID 21387192 YZCG86 KWBY 262000 !grib2/ncep/HRRR/#255/201608262000F006/MXUPHL01/5000-2000 m HGHT Aug 26 15:05:11 edextest edexBridge[5812] NOTE: Sent 2 messages (0 at the end of the queue, 2 normally). Aug 26 15:05:11 edextest pqact[5811] NOTE: Filed in \"/awips2/data_store/grid/HRRR/HRRR_CONUS_2p5km_201608262000_F006_CICEP-21387200.grib2\": 369464 20160826210511.484 NGRID 21387200 YMCG98 KWBY 262000 !grib2/ncep/HRRR/#255/201608262000F006/CICEP/0 - NONE Aug 26 15:05:12 edextest edexBridge[5812] NOTE: Sent 9 messages (0 at the end of the queue, 9 normally). Aug 26 15:05:12 edextest pqact[5811] NOTE: Filed in \"/awips2/data_store/grid/HRRR/HRRR_CONUS_2p5km_201608262000_F006_LTNG-21387205.grib2\": 482800 20160826210512.254 NGRID 21387205 YZCG98 KWBY 262000 !grib2/ncep/HRRR/#255/201608262000F006/LTNG/0 - EATM Aug 26 15:05:13 edextest edexBridge[5812] NOTE: Sent 1 messages (0 at the end of the queue, 1 normally).","title":"LDM Logging"},{"location":"edex/linux-tools/","text":"Using Standard Linux Tools \uf0c1 Several standard Linux tools can be used to monitor the EDEX processes, and for the purposes of this document and the Unidata AWIPS Training Workshop, it is assumed that all are available and that the user has some knowledge of how they are used. Regardless, this document includes the full command syntax that can be copy and pasted from the document to the terminal. 
ps - Display information about specific processes ps aux | grep edex cat - Used to display a text file in a terminal cat /awips2/ldm/etc/pqact.conf tail - Used to provide a dynamic picture of process logs tail -f /awips2/ldm/logs/ldmd.log grep - Used to filter content of process logs; used to filter output of other tools grep edexBridge /awips2/ldm/etc/ldmd.conf top - Provides a dynamic view of the memory and cpu usage of the EDEX processes psql - A terminal-based front-end to PostgreSQL. We will be executing SQL queries. You do not need to have previous experience with SQL to follow this guide, but navigating AWIPS metadata is made much easier with some experience. [awips@edex ~]$ psql metadata psql (9.2.4) Type \"help\" for help. metadata=# help You are using psql, the command-line interface to PostgreSQL. Type: \\copyright for distribution terms \\h for help with SQL commands \\? for help with psql commands \\g or terminate with semicolon to execute query \\q to quit metadata=# \\dt sat* List of relations Schema | Name | Type | Owner --------+-----------------------------------+-------+------- awips | satellite | table | awips awips | satellite_creating_entities | table | awips awips | satellite_geostationary_positions | table | awips awips | satellite_physical_elements | table | awips awips | satellite_sector_ids | table | awips awips | satellite_sources | table | awips awips | satellite_spatial | table | awips awips | satellite_units | table | awips (8 rows) metadata=# \\q","title":"Linux tools"},{"location":"edex/linux-tools/#using-standard-linux-tools","text":"Several standard Linux tools can be used to monitor the EDEX processes, and for the purposes of this document and the Unidata AWIPS Training Workshop, it is assumed that all are available and that the user has some knowledge of how they are used. Regardless, this document includes the full command syntax that can be copy and pasted from the document to the terminal. ps - Display information about specific processes ps aux | grep edex cat - Used to display a text file in a terminal cat /awips2/ldm/etc/pqact.conf tail - Used to provide a dynamic picture of process logs tail -f /awips2/ldm/logs/ldmd.log grep - Used to filter content of process logs; used to filter output of other tools grep edexBridge /awips2/ldm/etc/ldmd.conf top - Provides a dynamic view of the memory and cpu usage of the EDEX processes psql - A terminal-based front-end to PostgreSQL. We will be executing SQL queries. You do not need to have previous experience with SQL to follow this guide, but navigating AWIPS metadata is made much easier with some experience. [awips@edex ~]$ psql metadata psql (9.2.4) Type \"help\" for help. metadata=# help You are using psql, the command-line interface to PostgreSQL. Type: \\copyright for distribution terms \\h for help with SQL commands \\?
for help with psql commands \\g or terminate with semicolon to execute query \\q to quit metadata=# \\dt sat* List of relations Schema | Name | Type | Owner --------+-----------------------------------+-------+------- awips | satellite | table | awips awips | satellite_creating_entities | table | awips awips | satellite_geostationary_positions | table | awips awips | satellite_physical_elements | table | awips awips | satellite_sector_ids | table | awips awips | satellite_sources | table | awips awips | satellite_spatial | table | awips awips | satellite_units | table | awips (8 rows) metadata=# \\q","title":"Using Standard Linux Tools"},{"location":"edex/new-grid-grib1-old/","text":"Ingest a New Grid Using .grib Files \uf0c1 Unrecognized grids can be decoded by EDEX simply by dropping *.grib or *.grib2 files into /awips2/data_store/ingest/ To add support for a new grid, two edits must be made: Geospatial projection must be defined in a grid navigation file Grid name , center , subcenter , and process ID must be defined in a model definition file If the parameters in the grib file haven't been previously specified, another change may be needed as well: Center , subcenter , discipline , category , and possibly parameter ID information may need to be defined in a table Ingest an Unsupported Grid \uf0c1 Download Test Data \uf0c1 Download an example grib1 file and rename to a *.grib extension, then copy to the manual ingest point /awips2/data_store/ingest/ wget https://downloads.unidata.ucar.edu/awips2/current/files/14102318_nmm_d01.GrbF00600 -O wrf.grib cp wrf.grib /awips2/data_store/ingest/ Remember that the data distribution file ( /awips2/edex/data/utility/common_static/base/distribution/grib.xml ) will match filenames which have the *.grib* extension. Check Grib Logs \uf0c1 Confirm that the grib file decodes in the grib log file: edex log grib INFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.1200 (sec) Latency: 21.8080 (sec) INFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.1180 (sec) Latency: 21.8140 (sec) INFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.4230 (sec) Latency: 21.8360 (sec) INFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.2240 (sec) Latency: 21.9140 (sec) ... Check HDF5 Data \uf0c1 Check that the hdf5 data directory exists for our unnamed grid ls -latr /awips2/edex/data/hdf5/grid/GribModel:7:0:89 Though the grib file has been decoded, it has been given a generic name with its center, subcenter, and process IDs (7, 0, 89, respectively). 
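As an optional sanity check (a suggested query, not part of the standard procedure), you can also confirm the generic dataset name in the metadata database: psql metadata -c \"select distinct datasetid from grid_info where datasetid like 'GribModel%';\" which should include GribModel:7:0:89 for this example (any other unnamed grids will be listed as well).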
Determine Grid Projection \uf0c1 When the grid was ingested a record was added to the grid_coverage table with its navigation information: psql metadata metadata=# select nx,ny,dx,dy,majoraxis,minoraxis,la1,lo1,lov,latin1,latin2 from gridcoverage where id=(select distinct(location_id) from grid_info where datasetid='GribModel:7:0:89'); nx | ny | dx | dy | majoraxis | minoraxis | la1 | lo1 | lov | latin1 | latin2 -----+-----+------------------+------------------+-----------+-----------+------------------+-------------------+-------------------+------------------+------------------ 201 | 155 | 4.29699993133545 | 4.29699993133545 | 6378160 | 6356775 | 42.2830009460449 | -72.3610000610352 | -67.0770034790039 | 45.3680000305176 | 45.3680000305176 (1 row) Compare with the projection info returned by wgrib on the original file (look at the bolded sections below and make sure they match up with the corresponding entries returned from the database above): wgrib -V wrf.grib rec 799:27785754:date 2014102318 ALBDO kpds5=84 kpds6=1 kpds7=0 levels=(0,0) grid=255 sfc 6hr fcst: bitmap: 736 undef ALBDO=Albedo [%] timerange 0 P1 6 P2 0 TimeU 1 nx 201 ny 155 GDS grid 3 num_in_ave 0 missing 0 center 7 subcenter 0 process 89 Table 2 scan: WE:SN winds(grid) Lambert Conf: Lat1 42.283000 Lon1 -72.361000 Lov -67.077000 Latin1 45.368000 Latin2 45.368000 LatSP 0.000000 LonSP 0.000000 North Pole (201 x 155) Dx 4.297000 Dy 4.297000 scan 64 mode 8 min/max data 5 21.9 num bits 8 BDS_Ref 50 DecScale 1 BinScale 0 Notice that our grib file has a Lambert Conformal projection. We will need these values for the next step. There is a tolerance of +/- 0.1 degrees to keep in mind when defining your coverage area. Create Grid Projection File \uf0c1 Projection Types \uf0c1 Grid projection files are stored in /awips2/edex/data/utility/common_static/base/grib/grids/ and there are four grid coverage types available: lambertConformalGridCoverage (example: RUCIcing.xml ) 305 Regional - CONUS (Lambert Conformal) 16.322 -125.955 LowerLeft 151 113 40.63525 40.63525 km 6356775.0 6378160.0 -95.0 25.0 25.0 polarStereoGridCoverage (example seaice_south1_grid.xml ) 405 Sea Ice south 690X710 13km grid -36.866 139.806 LowerLeft 690 710 12.7 12.7 km 6371229.0 6371229.0 100.0 latLonGridCoverage (example UkmetHR-SHemisphere.xml ) 864162002 UKMet HiRes combined - Southern Hemisphere Longitude range 71.25E - 70.416E -89.721 71.25 LowerLeft 864 162 0.833 0.556 degree -0.278 70.416 mercatorGridCoverage (example gridNBM_PR.xml ) NBM_PR National Blend Grid over Puerto Rico - (1.25 km) 16.9775 -68.0278 LowerLeft 339 225 1.25 1.25 19.3750032477232 -63.984399999999994 20 km 6371200 6371200 Creating a New Projection File \uf0c1 Copy an existing xml file with the same grid projection type (in this case lambertConformalGridCoverage ) to a new file wrf.xml : cd /awips2/edex/data/utility/common_static/base/grib/grids/ cp RUCIcing.xml wrf.xml And edit the new wrf.xml to define the projection values using the output from wgrib or the database (example provided): vi wrf.xml 201155 Regional - CONUS (Lambert Conformal) 42.2830009460449 -72.3610000610352 LowerLeft 201 155 4.29699993133545 4.29699993133545 km 6356775.0 6378160.0 -67.0770034790039 45.3680000305176 45.3680000305176 Notice the 201155 tag was created by using the number of grid points (201 and 155). This name can be anything as long as it is unique and will be used to match against in the model definition. 
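If xmllint is available on your system (an optional check, not required by EDEX), you can verify that the new file is well-formed XML before restarting: xmllint --noout /awips2/edex/data/utility/common_static/base/grib/grids/wrf.xml No output means the file parsed cleanly.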
Create Model Definition \uf0c1 Model definition XML files are found in /awips2/edex/data/utility/common_static/base/grib/models/ . Since our grib file has a center ID of 7 (NCEP) we will edit the gribModels_NCEP-7.xml file. cd /awips2/edex/data/utility/common_static/base/grib/models/ vi gribModels_NCEP-7.xml In add an entry: WRF 7 0 201155 89 Save the file and restart EDEX for the changes to take effect: sudo service edex_camel restart ingestGrib Now copy the wrf.grib file again to /awips2/data_store/ingest/ . If everything is correct we will not see any persistence errors since the grid is now named WRF and not GribModel:7:0:89 . cp wrf.grib /awips2/data_store/ingest/ edex log grib After you have confirmed that the grid was ingested with the given name, you can edit the D2D product menus to display the new grid . Adding a Table \uf0c1 If you ingest a piece of data and the parameter appears as unknown in the metadata database, ensure that the correct parameter tables are in place for the center/subcenter. The tables are located in /awips2/edex/data/utility/common_static/base/grib/tables/ . They are then broken into subdirectories using the following structure: /[Center]/[Subcenter]/4.2.[Discipine].[Category].table . The center and subcenter have been identified previously here , as 7 and 0, respectively. So, the corresponding directory is: /awips2/edex/data/utility/common_static/base/grib/tables/7/0/ To find the discipline of a grib product, you need the process and table values from the grib file. These are output with the wgrib -V command: wgrib -V wrf.grib rec 799:27785754:date 2014102318 ALBDO kpds5=84 kpds6=1 kpds7=0 levels=(0,0) grid=255 sfc 6hr fcst: bitmap: 736 undef ALBDO=Albedo [%] timerange 0 P1 6 P2 0 TimeU 1 nx 201 ny 155 GDS grid 3 num_in_ave 0 missing 0 center 7 subcenter 0 process 89 Table 2 scan: WE:SN winds(grid) Lambert Conf: Lat1 42.283000 Lon1 -72.361000 Lov -67.077000 Latin1 45.368000 Latin2 45.368000 < LatSP 0.000000 LonSP 0.000000 North Pole (201 x 155) Dx 4.297000 Dy 4.297000 scan 64 mode 8 min/max data 5 21.9 num bits 8 BDS_Ref 50 DecScale 1 BinScale 0 For our example, the process is 89 and table is 2 . Next, take a look in: /awips2/edex/data/utility/common_static/base/grid/grib1ParameterConvTable.xml And find the entry that has grib1 data with TableVersion 2 and Value 89: 7 2 89 0 3 10 Here, we can see the discipline and category values (referred to as x above) are 0 and 3, respectively. 
So, the table needed for our example file is: /awips2/edex/data/utility/common_static/base/grib/tables/7/0/4.2.0.3.table","title":"Ingest a New Grid Using .grib Files"},{"location":"edex/new-grid-grib1-old/#ingest-a-new-grid-using-grib-files","text":"Unrecognized grids can be decoded by EDEX simply by dropping *.grib or *.grib2 files into /awips2/data_store/ingest/ To add support for a new grid, two edits must be made: Geospatial projection must be defined in a grid navigation file Grid name , center , subcenter , and process ID must be defined in a model definition file If the parameters in the grib file haven't been previously specified, another change may be needed as well: Center , subcenter , discipline , category , and possibly parameter ID information may need to be defined in a table","title":"Ingest a New Grid Using .grib Files"},{"location":"edex/new-grid-grib1-old/#ingest-an-unsupported-grid","text":"","title":"Ingest an Unsupported Grid"},{"location":"edex/new-grid-grib1-old/#download-test-data","text":"Download an example grib1 file and rename to a *.grib extension, then copy to the manual ingest point /awips2/data_store/ingest/ wget https://downloads.unidata.ucar.edu/awips2/current/files/14102318_nmm_d01.GrbF00600 -O wrf.grib cp wrf.grib /awips2/data_store/ingest/ Remember that the data distribution file ( /awips2/edex/data/utility/common_static/base/distribution/grib.xml ) will match filenames which have the *.grib* extension.","title":"Download Test Data"},{"location":"edex/new-grid-grib1-old/#check-grib-logs","text":"Confirm that the grib file decodes in the grib log file: edex log grib INFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.1200 (sec) Latency: 21.8080 (sec) INFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.1180 (sec) Latency: 21.8140 (sec) INFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.4230 (sec) Latency: 21.8360 (sec) INFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.2240 (sec) Latency: 21.9140 (sec) ...","title":"Check Grib Logs"},{"location":"edex/new-grid-grib1-old/#check-hdf5-data","text":"Check that the hdf5 data directory exists for our unnamed grid ls -latr /awips2/edex/data/hdf5/grid/GribModel:7:0:89 Though the grib file has been decoded, it has been given a generic name with its center, subcenter, and process IDs (7, 0, 89, respectively).","title":"Check HDF5 Data"},{"location":"edex/new-grid-grib1-old/#determine-grid-projection","text":"When the grid was ingested a record was added to the grid_coverage table with its navigation information: psql metadata metadata=# select nx,ny,dx,dy,majoraxis,minoraxis,la1,lo1,lov,latin1,latin2 from gridcoverage where id=(select distinct(location_id) from grid_info where datasetid='GribModel:7:0:89'); nx | ny | dx | dy | majoraxis | minoraxis | la1 | lo1 | lov | latin1 | latin2 -----+-----+------------------+------------------+-----------+-----------+------------------+-------------------+-------------------+------------------+------------------ 201 | 155 | 4.29699993133545 | 4.29699993133545 | 6378160 | 6356775 | 42.2830009460449 | -72.3610000610352 | -67.0770034790039 | 45.3680000305176 | 45.3680000305176 (1 row) Compare with the projection info returned by wgrib on the original file (look at the bolded sections below and make sure they match up with the corresponding entries returned from the database above): wgrib -V wrf.grib rec 
799:27785754:date 2014102318 ALBDO kpds5=84 kpds6=1 kpds7=0 levels=(0,0) grid=255 sfc 6hr fcst: bitmap: 736 undef ALBDO=Albedo [%] timerange 0 P1 6 P2 0 TimeU 1 nx 201 ny 155 GDS grid 3 num_in_ave 0 missing 0 center 7 subcenter 0 process 89 Table 2 scan: WE:SN winds(grid) Lambert Conf: Lat1 42.283000 Lon1 -72.361000 Lov -67.077000 Latin1 45.368000 Latin2 45.368000 LatSP 0.000000 LonSP 0.000000 North Pole (201 x 155) Dx 4.297000 Dy 4.297000 scan 64 mode 8 min/max data 5 21.9 num bits 8 BDS_Ref 50 DecScale 1 BinScale 0 Notice that our grib file has a Lambert Conformal projection. We will need these values for the next step. There is a tolerance of +/- 0.1 degrees to keep in mind when defining your coverage area.","title":"Determine Grid Projection"},{"location":"edex/new-grid-grib1-old/#create-grid-projection-file","text":"","title":"Create Grid Projection File"},{"location":"edex/new-grid-grib1-old/#projection-types","text":"Grid projection files are stored in /awips2/edex/data/utility/common_static/base/grib/grids/ and there are four grid coverage types available: lambertConformalGridCoverage (example: RUCIcing.xml ) 305 Regional - CONUS (Lambert Conformal) 16.322 -125.955 LowerLeft 151 113 40.63525 40.63525 km 6356775.0 6378160.0 -95.0 25.0 25.0 polarStereoGridCoverage (example seaice_south1_grid.xml ) 405 Sea Ice south 690X710 13km grid -36.866 139.806 LowerLeft 690 710 12.7 12.7 km 6371229.0 6371229.0 100.0 latLonGridCoverage (example UkmetHR-SHemisphere.xml ) 864162002 UKMet HiRes combined - Southern Hemisphere Longitude range 71.25E - 70.416E -89.721 71.25 LowerLeft 864 162 0.833 0.556 degree -0.278 70.416 mercatorGridCoverage (example gridNBM_PR.xml ) NBM_PR National Blend Grid over Puerto Rico - (1.25 km) 16.9775 -68.0278 LowerLeft 339 225 1.25 1.25 19.3750032477232 -63.984399999999994 20 km 6371200 6371200 ","title":"Projection Types"},{"location":"edex/new-grid-grib1-old/#creating-a-new-projection-file","text":"Copy an existing xml file with the same grid projection type (in this case lambertConformalGridCoverage ) to a new file wrf.xml : cd /awips2/edex/data/utility/common_static/base/grib/grids/ cp RUCIcing.xml wrf.xml And edit the new wrf.xml to define the projection values using the output from wgrib or the database (example provided): vi wrf.xml 201155 Regional - CONUS (Lambert Conformal) 42.2830009460449 -72.3610000610352 LowerLeft 201 155 4.29699993133545 4.29699993133545 km 6356775.0 6378160.0 -67.0770034790039 45.3680000305176 45.3680000305176 Notice the 201155 tag was created by using the number of grid points (201 and 155). This name can be anything as long as it is unique and will be used to match against in the model definition.","title":"Creating a New Projection File"},{"location":"edex/new-grid-grib1-old/#create-model-definition","text":"Model definition XML files are found in /awips2/edex/data/utility/common_static/base/grib/models/ . Since our grib file has a center ID of 7 (NCEP) we will edit the gribModels_NCEP-7.xml file. cd /awips2/edex/data/utility/common_static/base/grib/models/ vi gribModels_NCEP-7.xml In add an entry: WRF 7 0 201155 89 Save the file and restart EDEX for the changes to take effect: sudo service edex_camel restart ingestGrib Now copy the wrf.grib file again to /awips2/data_store/ingest/ . If everything is correct we will not see any persistence errors since the grid is now named WRF and not GribModel:7:0:89 . 
cp wrf.grib /awips2/data_store/ingest/ edex log grib After you have confirmed that the grid was ingested with the given name, you can edit the D2D product menus to display the new grid .","title":"Create Model Definition"},{"location":"edex/new-grid-grib1-old/#adding-a-table","text":"If you ingest a piece of data and the parameter appears as unknown in the metadata database, ensure that the correct parameter tables are in place for the center/subcenter. The tables are located in /awips2/edex/data/utility/common_static/base/grib/tables/ . They are then broken into subdirectories using the following structure: /[Center]/[Subcenter]/4.2.[Discipine].[Category].table . The center and subcenter have been identified previously here , as 7 and 0, respectively. So, the corresponding directory is: /awips2/edex/data/utility/common_static/base/grib/tables/7/0/ To find the discipline of a grib product, you need the process and table values from the grib file. These are output with the wgrib -V command: wgrib -V wrf.grib rec 799:27785754:date 2014102318 ALBDO kpds5=84 kpds6=1 kpds7=0 levels=(0,0) grid=255 sfc 6hr fcst: bitmap: 736 undef ALBDO=Albedo [%] timerange 0 P1 6 P2 0 TimeU 1 nx 201 ny 155 GDS grid 3 num_in_ave 0 missing 0 center 7 subcenter 0 process 89 Table 2 scan: WE:SN winds(grid) Lambert Conf: Lat1 42.283000 Lon1 -72.361000 Lov -67.077000 Latin1 45.368000 Latin2 45.368000 < LatSP 0.000000 LonSP 0.000000 North Pole (201 x 155) Dx 4.297000 Dy 4.297000 scan 64 mode 8 min/max data 5 21.9 num bits 8 BDS_Ref 50 DecScale 1 BinScale 0 For our example, the process is 89 and table is 2 . Next, take a look in: /awips2/edex/data/utility/common_static/base/grid/grib1ParameterConvTable.xml And find the entry that has grib1 data with TableVersion 2 and Value 89: 7 2 89 0 3 10 Here, we can see the discipline and category values (referred to as x above) are 0 and 3, respectively. So, the table needed for our example file is: /awips2/edex/data/utility/common_static/base/grib/tables/7/0/4.2.0.3.table","title":"Adding a Table"},{"location":"edex/new-grid/","text":"Ingest a New Grid \uf0c1 Unrecognized grids can be decoded by EDEX simply by dropping *.grib or *.grib2 files into /awips2/data_store/ingest/ This page explains how to ingest .grib2 products. To view information about .grib products, please see this page . To add support for a new grid, two edits must be made: Geospatial projection must be defined in a grid navigation file Grid name , center , subcenter , and process ID must be defined in a model definition file If the parameters in the grib file haven't been previously specified, another change may be needed as well: Center , subcenter , discipline , category , and possibly parameter ID information may need to be defined in a table Ingest an Unsupported Grid \uf0c1 Download Test Data \uf0c1 Download an example grib2 file (make sure the extension is .grib2 or the EDEX distribution file may not recognize it), and then copy to the manual ingest point /awips2/data_store/ingest/ : wget https://downloads.unidata.ucar.edu/awips2/current/files/CPTI_00.50_20180502-000144.grib2 -O cpti.grib2 cp cpti.grib2 /awips2/data_store/ingest/ Check Grib Logs \uf0c1 Confirm that the grib file decodes in the grib log file. 
Look in the current log file (/awips2/edex/logs/edex-ingestGrib-[YYYYMMDD].log) for the following: INFO [Ingest.GribDecode] /awips2/data_store/ingest/cpti.grib2 processed in: 0.1200 (sec) Latency: 21.8080 (sec) INFO [Ingest.GribDecode] /awips2/data_store/ingest/cpti.grib2 processed in: 0.1180 (sec) Latency: 21.8140 (sec) INFO [Ingest.GribDecode] /awips2/data_store/ingest/cpti.grib2 processed in: 0.4230 (sec) Latency: 21.8360 (sec) INFO [Ingest.GribDecode] /awips2/data_store/ingest/cpti.grib2 processed in: 0.2240 (sec) Latency: 21.9140 (sec) ... This step will fail for our example because the parameter is not yet defined. The error will look like: INFO 2020-07-20 20:34:17,710 2565 [GribPersist-1] GridDao: EDEX - Discarding record due to missing or unknown parameter mapping: /grid/2018-05-02_00:01:44.0_(0)/GribModel:161:0:97/null/null/403/Missing/FH/500.0/-999999.0 INFO 2020-07-20 20:34:17,710 2566 [GribPersist-1] Ingest: EDEX: Ingest - grib2:: /awips2/data_store/ingest/CPTI_00.50_20180502-000144.grib2 processed in: 2.3550 (sec) INFO 2020-07-20 20:34:17,827 2567 [Ingest.GribDecode-6] grib: EDEX - No parameter information for center[161], subcenter[0], tableName[4.2.209.3], parameter value[61] In order to successfully ingest the example file, you must define the appropriate table . Check HDF5 Data \uf0c1 Check that the hdf5 data directory exists for our unnamed grid ls -latr /awips2/edex/data/hdf5/grid/GribModel:161:0:97 Though the grib file has been decoded, it has been given a generic name with its center, subcenter, and process IDs (161, 0, 97, respectively). Determine Grid Projection \uf0c1 When a grid is ingested a record is added to the grid_coverage table with its navigation information: psql metadata metadata=> select nx,ny,dx,dy,majoraxis,minoraxis,la1,lo1,lov,latin1,latin2,spacingunit,lad,la2,latin,lo2,firstgridpointcorner from gridcoverage where id=(select distinct(location_id) from grid_info where datasetid='GribModel:161:0:97'); nx | ny | dx | dy | majoraxis | minoraxis | la1 | lo1 | lov | latin1 | latin2 | spacingunit | lad | la2 | latin | lo2 | firstgridpointcorner -----+-----+-------+-------+-----------+-----------+-----------+-----+-----+--------+--------+-------------+-----+-----+-------+-----+---------------------- 600 | 640 | 0.005 | 0.005 | | | 40.799999 | 261 | | | | degree | | | | | UpperLeft (1 row) Compare with the projection info returned by wgrib2 on the original file (look at the bolded sections below and make sure they match up with the corresponding entries returned from the database above): wgrib2 -grid -nxny cpti.grib2 1:0:grid_template=0:winds(N/S): lat-lon grid:(600 x 640) units 1e-06 input WE:NS output WE:SN res 48 lat 40.799999 to 37.599999 by 0.005000 lon 260.999999 to 263.999999 by 0.005000 #points=384000:(600 x 640) ... Notice that our grib2 file has a Lat/lon Grid projection, that starts in the UpperLeft corner (as defined by input West to East, North to South). Where: nx is 600 ny is 640 dx is 0.005 dy is 0.005 la1 is 40.799999 lo1 is 261 We will need these values for the next step. There is a tolerance of +/- 0.1 degrees to keep in mind when defining your coverage (la1 and lo1) area. Create Grid Projection File \uf0c1 Projection Types \uf0c1 You may not have information for every tag listed, for example it's not required for the latLonGridCoverage to have spacingUnit, la2, lo2. 
Grid projection files are stored in /awips2/edex/data/utility/common_static/base/grib/grids/ and there are four grid coverage types available: lambertConformalGridCoverage (example: RUCIcing.xml ) 305 Regional - CONUS (Lambert Conformal) 16.322 -125.955 LowerLeft 151 113 40.63525 40.63525 km 6356775.0 6378160.0 -95.0 25.0 25.0 polarStereoGridCoverage (example seaice_south1_grid.xml ) 405 Sea Ice south 690X710 13km grid -36.866 139.806 LowerLeft 690 710 12.7 12.7 km 6371229.0 6371229.0 100.0 latLonGridCoverage (example UkmetHR-SHemisphere.xml ) 864162002 UKMet HiRes combined - Southern Hemisphere Longitude range 71.25E - 70.416E -89.721 71.25 LowerLeft 864 162 0.833 0.556 degree -0.278 70.416 mercatorGridCoverage (example gridNBM_PR.xml ) NBM_PR National Blend Grid over Puerto Rico - (1.25 km) 16.9775 -68.0278 LowerLeft 339 225 1.25 1.25 19.3750032477232 -63.984399999999994 20 km 6371200 6371200 Creating a New Projection File \uf0c1 Copy an existing xml file with the same grid projection type (in this case latLonGridCoverage ) to a new file cpti.xml : cd /awips2/edex/data/utility/common_static/base/grib/grids/ cp MRMS-1km-CONUS.xml cpti.xml And edit the new cpti.xml to define the projection values using the output from wgrib2 or the database (example provided): vi cpti.xml 600640 Small domain for CPTI products 40.799999 261 UpperLeft 600 640 0.005 0.005 degree Notice the 600640 tag was created by using the number of grid points (600 and 640). This name can be anything as long as it is unique and will be used to match against in the model definition. Create Model Definition \uf0c1 Model definition XML files are found in /awips2/edex/data/utility/common_static/base/grib/models/ . Since our grib2 file has a center of 161 (NOAA) we will edit the gribModels_NOAA-161.xml file. cd /awips2/edex/data/utility/common_static/base/grib/models/ vi gribModels_NOAA-161.xml In , under the <-- Subcenter 0 --> comment, add an entry: CPTI 161 0 600640 97 Save the model file and restart edex: sudo service edex_camel restart ingestGrib Now if you drop cpti.grib2 into the manual endpoint again, it should ingest without any persistence errors. Adding a Table \uf0c1 If you ingest a piece of data and the parameter appears as unknown in the metadata database, ensure that the correct parameter tables are in place for the center/subcenter. The tables are located in /awips2/edex/data/utility/common_static/base/grib/tables/ . They are then broken into subdirectories using the following structure: /[Center]/[Subcenter]/4.2.[Discipine].[Category].table . There are also default parameters that all grib products may access located in this directory: /awips2/edex/data/utility/common_static/base/grib/tables/-1/-1/ If you are using a grib2 file, then you can use either the log output or the -center , -subcenter , and -full_name options on wgrib2 to get the center, subcenter, discipline, category, and parameter information: The table would be found in the directory structure using this file's center and subcenter. Finding Center \uf0c1 The center can be found by either: Running the following command: wgrib2 -center cpti.grib2 1:0:center=US NOAA Office of Oceanic and Atmospheric Research ... And then looking up the corresponding value for \"US NOAA Office of Oceanic and Atmospheric Research\" at this website , where it happens to be 161 . OR: Running the following command: wgrib2 -varX cpti.grib2 1:0:var209_255_1_ 161 _3_61 ... Where the 4th argument after \"var\" is the center id, in this case 161 . 
Finding Subcenter \uf0c1 To get the subcenter, simply run: wgrib2 -subcenter cpti.grib2 1:0:subcenter= 0 ... The subcenter of this file is 0 . Based on the center and subcenter, the corresponding directory is: /awips2/edex/data/utility/common_static/base/grib/tables/161/0/ Finding Discipline and Category \uf0c1 To find the exact table, we need the discipline and category: wgrib2 -full_name cpti.grib2 1:0:var 209 _ 3 _ 61 .500_m_above_mean_sea_level ... In this case the discipline is 209 and category is 3 , so the corresponding table is: 4.2.209.3.table Corresponding Table \uf0c1 The full path to the corresponding table would be: /awips2/edex/data/utility/common_static/base/grib/tables/161/0/4.2.209.3.table The parameter ID was also listed in that output as 61 . Make sure that specific parameter information is defined in the table: ... 56:56:Reflectivity at -20C:dBZ:ReflectivityM20C 57:57:Reflectivity At Lowest Altitude (RALA):dBZ:ReflectivityAtLowestAltitude 58:58:Merged Reflectivity At Lowest Altitude (RALA):dBZ:MergedReflectivityAtLowestAltitude 59:59:CPTI 80mph+:%:CPTI80mph 61:61:CPTI 110mph+:%:CPTI110mph You will have to restart ingestGrib for the changes to take place: sudo service edex_camel restart ingestGrib Now you can try re-ingesting the grib2 file . Creating Menu Items \uf0c1 After you have confirmed that the grid was ingested with the given name, you can edit the D2D product menus to display the new grid . Implementing a Production Process \uf0c1 The ingest method mentioned earlier is strictly meant to only be used during testing and development of ingesting new grid data. This is because the manual endpoint is very inefficient during ingest. It creates copies of the data file and uses more resources than you'd want in a production process. Once you are satisfied with the data ingest and display in CAVE, then we highly recommend you implement a production process for ingest that does not involve the manual directory ( /awips2/data_store/ingest/ ). The recommended way is to make use of a Python script we distribute with AWIPS (EDEX). This script is called notifyAWIPS2-unidata.py and is located in the /awips2/ldm/dev/ directory. If you are already using a script to manually gather the data, then adding an additional call like the one below should ingest your data to EDEX in an efficient manner: /awips2/ldm/dev/notifyAWIPS2-unidata.py [path-to-new-grib-file] Make sure the Python script is executable. To do this you may have to run chmod +x /awips2/ldm/dev/notifyAWIPS2-unidata.py Using wgrib2 \uf0c1 Mentioned in this page are a few command parameters for wgrib2 such as -grid , -varX , -center , -subcenter , and -full_name . A complete list of all available parameters can be found here . Troubleshooting Grib Ingest \uf0c1 Make sure the latitude and longitude entries in your coverage specification file match those of your ingested raw grib file. There is a tolerance of +/- 0.1 degree to keep in mind when defining your coverage area. If some of the information is unknown, using a grib utility application such as wgrib and wgrib2 can be useful in determining the information that must be added to correctly process a new grib file.
If you are experiencing Segmentation fault errors when running wgrib2, it may be best to install the latest version using the following command: yum install wgrib2 And then you may either need to change where wgrib2 points to, or use /bin/wgrib2 to run the recently downloaded version.","title":"Ingest a New Grid"},{"location":"edex/new-grid/#ingest-a-new-grid","text":"Unrecognized grids can be decoded by EDEX simply by dropping *.grib or *.grib2 files into /awips2/data_store/ingest/ This page explains how to ingest .grib2 products. To view information about .grib products, please see this page . To add support for a new grid, two edits must be made: Geospatial projection must be defined in a grid navigation file Grid name , center , subcenter , and process ID must be defined in a model definition file If the parameters in the grib file haven't been previously specified, another change may be needed as well: Center , subcenter , discipline , category , and possibly parameter ID information may need to be defined in a table","title":"Ingest a New Grid"},{"location":"edex/new-grid/#ingest-an-unsupported-grid","text":"","title":"Ingest an Unsupported Grid"},{"location":"edex/new-grid/#download-test-data","text":"Download an example grib2 file (make sure the extension is .grib2 or the EDEX distribution file may not recognize it), and then copy to the manual ingest point /awips2/data_store/ingest/ : wget https://downloads.unidata.ucar.edu/awips2/current/files/CPTI_00.50_20180502-000144.grib2 -O cpti.grib2 cp cpti.grib2 /awips2/data_store/ingest/","title":"Download Test Data"},{"location":"edex/new-grid/#check-grib-logs","text":"Confirm that the grib file decodes in the grib log file. Look in the current log file (/awips2/edex/logs/edex-ingestGrib-[YYYYMMDD].log) for the following: INFO [Ingest.GribDecode] /awips2/data_store/ingest/cpti.grib2 processed in: 0.1200 (sec) Latency: 21.8080 (sec) INFO [Ingest.GribDecode] /awips2/data_store/ingest/cpti.grib2 processed in: 0.1180 (sec) Latency: 21.8140 (sec) INFO [Ingest.GribDecode] /awips2/data_store/ingest/cpti.grib2 processed in: 0.4230 (sec) Latency: 21.8360 (sec) INFO [Ingest.GribDecode] /awips2/data_store/ingest/cpti.grib2 processed in: 0.2240 (sec) Latency: 21.9140 (sec) ... This step will fail for our example because the parameter is not yet defined. 
The error will look like: INFO 2020-07-20 20:34:17,710 2565 [GribPersist-1] GridDao: EDEX - Discarding record due to missing or unknown parameter mapping: /grid/2018-05-02_00:01:44.0_(0)/GribModel:161:0:97/null/null/403/Missing/FH/500.0/-999999.0 INFO 2020-07-20 20:34:17,710 2566 [GribPersist-1] Ingest: EDEX: Ingest - grib2:: /awips2/data_store/ingest/CPTI_00.50_20180502-000144.grib2 processed in: 2.3550 (sec) INFO 2020-07-20 20:34:17,827 2567 [Ingest.GribDecode-6] grib: EDEX - No parameter information for center[161], subcenter[0], tableName[4.2.209.3], parameter value[61] In order to successfully ingest the example file, you must define the appropriate table .","title":"Check Grib Logs"},{"location":"edex/new-grid/#check-hdf5-data","text":"Check that the hdf5 data directory exists for our unnamed grid ls -latr /awips2/edex/data/hdf5/grid/GribModel:161:0:97 Though the grib file has been decoded, it has been given a generic name with its center, subcenter, and process IDs (161, 0, 97, respectively).","title":"Check HDF5 Data"},{"location":"edex/new-grid/#determine-grid-projection","text":"When a grid is ingested a record is added to the grid_coverage table with its navigation information: psql metadata metadata=> select nx,ny,dx,dy,majoraxis,minoraxis,la1,lo1,lov,latin1,latin2,spacingunit,lad,la2,latin,lo2,firstgridpointcorner from gridcoverage where id=(select distinct(location_id) from grid_info where datasetid='GribModel:161:0:97'); nx | ny | dx | dy | majoraxis | minoraxis | la1 | lo1 | lov | latin1 | latin2 | spacingunit | lad | la2 | latin | lo2 | firstgridpointcorner -----+-----+-------+-------+-----------+-----------+-----------+-----+-----+--------+--------+-------------+-----+-----+-------+-----+---------------------- 600 | 640 | 0.005 | 0.005 | | | 40.799999 | 261 | | | | degree | | | | | UpperLeft (1 row) Compare with the projection info returned by wgrib2 on the original file (look at the bolded sections below and make sure they match up with the corresponding entries returned from the database above): wgrib2 -grid -nxny cpti.grib2 1:0:grid_template=0:winds(N/S): lat-lon grid:(600 x 640) units 1e-06 input WE:NS output WE:SN res 48 lat 40.799999 to 37.599999 by 0.005000 lon 260.999999 to 263.999999 by 0.005000 #points=384000:(600 x 640) ... Notice that our grib2 file has a Lat/lon Grid projection, that starts in the UpperLeft corner (as defined by input West to East, North to South). Where: nx is 600 ny is 640 dx is 0.005 dy is 0.005 la1 is 40.799999 lo1 is 261 We will need these values for the next step. There is a tolerance of +/- 0.1 degrees to keep in mind when defining your coverage (la1 and lo1) area.","title":"Determine Grid Projection"},{"location":"edex/new-grid/#create-grid-projection-file","text":"","title":"Create Grid Projection File"},{"location":"edex/new-grid/#projection-types","text":"You may not have information for every tag listed, for example it's not required for the latLonGridCoverage to have spacingUnit, la2, lo2. 
Grid projection files are stored in /awips2/edex/data/utility/common_static/base/grib/grids/ and there are four grid coverage types available: lambertConformalGridCoverage (example: RUCIcing.xml ) 305 Regional - CONUS (Lambert Conformal) 16.322 -125.955 LowerLeft 151 113 40.63525 40.63525 km 6356775.0 6378160.0 -95.0 25.0 25.0 polarStereoGridCoverage (example seaice_south1_grid.xml ) 405 Sea Ice south 690X710 13km grid -36.866 139.806 LowerLeft 690 710 12.7 12.7 km 6371229.0 6371229.0 100.0 latLonGridCoverage (example UkmetHR-SHemisphere.xml ) 864162002 UKMet HiRes combined - Southern Hemisphere Longitude range 71.25E - 70.416E -89.721 71.25 LowerLeft 864 162 0.833 0.556 degree -0.278 70.416 mercatorGridCoverage (example gridNBM_PR.xml ) NBM_PR National Blend Grid over Puerto Rico - (1.25 km) 16.9775 -68.0278 LowerLeft 339 225 1.25 1.25 19.3750032477232 -63.984399999999994 20 km 6371200 6371200 ","title":"Projection Types"},{"location":"edex/new-grid/#creating-a-new-projection-file","text":"Copy an existing xml file with the same grid projection type (in this case latLonGridCoverage ) to a new file cpti.xml : cd /awips2/edex/data/utility/common_static/base/grib/grids/ cp MRMS-1km-CONUS.xml cpti.xml And edit the new cpti.xml to define the projection values using the output from wgrib2 or the database (example provided): vi cpti.xml 600640 Small domain for CPTI products 40.799999 261 UpperLeft 600 640 0.005 0.005 degree Notice the 600640 tag was created by using the number of grid points (600 and 640). This name can be anything as long as it is unique and will be used to match against in the model definition.","title":"Creating a New Projection File"},{"location":"edex/new-grid/#create-model-definition","text":"Model definition XML files are found in /awips2/edex/data/utility/common_static/base/grib/models/ . Since our grib2 file has a center of 161 (NOAA) we will edit the gribModels_NOAA-161.xml file. cd /awips2/edex/data/utility/common_static/base/grib/models/ vi gribModels_NOAA-161.xml In , under the <-- Subcenter 0 --> comment, add an entry: CPTI 161 0 600640 97 Save the model file and restart edex: sudo service edex_camel restart ingestGrib Now if you drop cpti.grib2 into the manual endpoint again, it should ingest without any persistence errors.","title":"Create Model Definition"},{"location":"edex/new-grid/#adding-a-table","text":"If you ingest a piece of data and the parameter appears as unknown in the metadata database, ensure that the correct parameter tables are in place for the center/subcenter. The tables are located in /awips2/edex/data/utility/common_static/base/grib/tables/ . They are then broken into subdirectories using the following structure: /[Center]/[Subcenter]/4.2.[Discipine].[Category].table . There are also default parameters that all grib products may access located in this directory: /awips2/edex/data/utility/common_static/base/grib/tables/-1/-1/ If you are using a grib2 file, then you can use either the log output or the -center , -subcenter , and -full_name options on wgrib2 to get the center, subcenter, discipline, category, and parameter information: The table would be found in the directory structure using this file's center and subcenter.","title":"Adding a Table"},{"location":"edex/new-grid/#finding-center","text":"The center can be found by either: Running the following command: wgrib2 -center cpti.grib2 1:0:center=US NOAA Office of Oceanic and Atmospheric Research ... 
And then looking up the corresponding value for \"US NOAA Office of Oceanic and Atmospheric Research\" at this website , where it happens to be 161 . OR: Running the following command: wgrib2 -varX cpti.grib2 1:0:var209_255_1_ 161 _3_61 ... Where the 4th argument after \"var\" is the center id, in this case 161 .","title":"Finding Center"},{"location":"edex/new-grid/#finding-subcenter","text":"To get the subcenter, simply run: wgrib2 -subcenter cpti.grib2 1:0:subcenter= 0 ... The subcenter of this file is 0 . Based on the center and subcenter, the corresponding directory is: /awips2/edex/data/utility/common_static/base/grib/tables/161/0/","title":"Finding Subcenter"},{"location":"edex/new-grid/#finding-discipline-and-category","text":"To find the exact table, we need the discipline and category: wgrib2 -full_name cpti.grib2 1:0:var 209 _ 3 _ 61 .500_m_above_mean_sea_level ... In this case the discipline is 209 and category is 3 , so the corresponding table is: 4.2.209.3.table","title":"Finding Discipline and Category"},{"location":"edex/new-grid/#corresponding-table","text":"The full path to the corresponding table would be: /awips2/edex/data/utility/common_static/base/grib/tables/161/0/4.2.209.3.table The parameter ID was also listed in that output as 61 . Make sure that specific parameter information is defined in the table: ... 56:56:Reflectivity at -20C:dBZ:ReflectivityM20C 57:57:Reflectivity At Lowest Altitude (RALA):dBZ:ReflectivityAtLowestAltitude 58:58:Merged Reflectivity At Lowest Altitude (RALA):dBZ:MergedReflectivityAtLowestAltitude 59:59:CPTI 80mph+:%:CPTI80mph 61:61:CPTI 110mph+:%:CPTI110mph You will have to restart ingestGrib for the changes to take place: sudo service edex_camel restart ingestGrib Now you can try re-ingesting the grib2 file .","title":"Corresponding Table"},{"location":"edex/new-grid/#creating-menu-items","text":"After you have confirmed that the grid was ingested with the given name, you can edit the D2D product menus to display the new grid .","title":"Creating Menu Items"},{"location":"edex/new-grid/#implementing-a-production-process","text":"The ingest method mentioned earlier is strictly meant to only be used during testing and development of ingesting new grid data. The reasoning is because the manual end point is very inefficent during its ingest. It creates copies of the data file and uses more resources than you'd want in a production process. Once you are satisfied with the data ingest and display in CAVE, then we highly recommend you implement a production process for ingest that does not involve the manual directory ( /awips2/data_store/ingest/ ). The recommended way is to make use of a Python script we distribute with AWIPS (EDEX). This script is called notifyAWIPS2-unidata.py and located in th /awips2/ldm/dev/ directory. If you are already using a script to manually gather the data, then adding an additional call like the one below, should ingest your data to EDEX in an efficient manner: /awips2/ldm/dev/notifyAWIPS2-unidata.py [path-to-new-grib-file] Make sure the python script is executable. To do this you may have to run chmod +x /awips2/ldm/dev/notifyAWIPS2-unidata.py","title":"Implementing a Production Process"},{"location":"edex/new-grid/#using-wgrib2","text":"Mentioned in this page are a few command parameters for wgrib2 such as -grid , varX , -center , -subcenter , and -full_name . 
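If desired, several of these inventory options can typically be combined in a single call (output formatting may vary slightly between wgrib2 versions), for example: wgrib2 -center -subcenter -full_name cpti.grib2 which reports the center, subcenter, and full parameter name for each record in one pass.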
A complete list of all available parameters can be found here .","title":"Using wgrib2"},{"location":"edex/new-grid/#troubleshooting-grib-ingest","text":"Make sure the latitude and longitude entries in your coverage specification file match those of your ingested raw grib file. There is a tolerance of +/- 0.1 degree to keep in mind when defining your coverage area. If some of the information is unknown, using a grib utility application such as wgrib and wgrib2 can be useful in determining the information that must be added to correctly process a new grib file. If you are experiencing Segmentation fault errors when running wgrib2, it may be best to install the latest version using the following command: yum install wgrib2 And then you may either need to change where wgrib2 points to, or use /bin/wgrib2 to run the recently downloaded version.","title":"Troubleshooting Grib Ingest"},{"location":"edex/settings/","text":"EDEX Settings \uf0c1 Plugin Configuration \uf0c1 The directory /awips2/edex/conf/resources contains configuration text files for specific plugins, which allow for user-defined values which are read by AWIPS plugins on EDEX start: acarssounding.properties autobldsrv.properties com.raytheon.edex.plugin.gfe.properties com.raytheon.edex.text.properties com.raytheon.uf.common.registry.ebxml.properties com.raytheon.uf.edex.archive.cron.properties com.raytheon.uf.edex.database.properties com.raytheon.uf.edex.registry.ebxml.properties distribution.properties edex-localization-http.properties edex-requestsrv.properties edex-uengine.properties eventBus.properties ftp.properties goesr.properties grib.properties maintenance.properties proxy.properties purge.properties quartz.properties radar.properties stats.properties textdbsrv.properties warning.properties Look at purge.properties for example: # Master switch to enable and disable purging purge.enabled=true # Interval at which the purge job kicks off purge.cron=0+0/15+*+*+*+? # Interval at which the outgoing files are purged purge.outgoing.cron=0+30+*+*+*+? # Interval at which the logs are purged purge.logs.cron=0+30+0+*+*+? # Interval at which hdf5 orphans are purged purge.orphan.period=24h # Number of days older than the earliest known data to delete. purge.orphan.buffer=7 ... In grib.properties , goesr.properties , and radar.properties you can adjust the number of decoder threads for each plugin. cat radar.properties # Number threads for radar products ingested from the SBN radar-decode.sbn.threads=5 Ingest Modes \uf0c1 By default, EDEX starts three \"modes\": ingest , ingestGrib , and request (each as its own JVM). The file /awips2/edex/conf/modes/modes.xml contains all available mode definitions, including some specific modes for Hydro Server Applications, ebXML Registries, Data Delivery, and more. EDEX services are registered through spring, and by including or excluding specific spring files (usually by datatype plugin name) we can finely customize EDEX startup. In /awips2/edex/conf/modes/modes.xml there are a number of unused plugin decoders excluded because the data are not available outside of the SBN: ... .*request.* edex-security.xml ebxml.*\\.xml grib-decode.xml grid-staticdata-process.xml .*(dpa|taf|nctext).* webservices.xml .*datadelivery.* .*bandwidth.* .*sbn-simulator.* hydrodualpol-ingest.xml grid-metadata.xml .*ogc.* obs-ingest-metarshef.xml ffmp-ingest.xml scan-ingest.xml cwat-ingest.xml fog-ingest.xml vil-ingest.xml preciprate-ingest.xml qpf-ingest.xml fssobs-ingest.xml cpgsrv-spring.xml ... 
In this example, request, ebXML, grib plugins, OGC and other plugins are excluded because they are included in their own mode/JVM. Note : TAF and NCTEXT plugins are disabled here due to performance issues. JVM Memory \uf0c1 The directory /awips2/edex/etc/ contains files which define the amount of memory used for each of the three EDEX JVMs (ingest, ingestGrib, request): ls -al /awips2/edex/etc/ -rw-r--r-- 1 awips fxalpha 1287 Jul 24 18:41 centralRegistry.sh -rw-r--r-- 1 awips fxalpha 1155 Jul 24 18:42 default.sh -rw-r--r-- 1 awips fxalpha 1956 Jul 24 18:41 ingestGrib.sh -rw-r--r-- 1 awips fxalpha 337 Jul 24 18:36 ingest.sh -rw-r--r-- 1 awips fxalpha 848 Jul 24 18:42 profiler.sh -rw-r--r-- 1 awips fxalpha 1188 Jul 24 18:41 registry.sh -rw-r--r-- 1 awips fxalpha 601 Jul 24 18:36 request.sh Each file contains the Xmx definition for maximum memory: ... export INIT_MEM=512 # in Meg export MAX_MEM=4096 # in Meg ... After editing these files, you must restart : service edex_camel restart .","title":"EDEX Settings"},{"location":"edex/settings/#edex-settings","text":"","title":"EDEX Settings"},{"location":"edex/settings/#plugin-configuration","text":"The directory /awips2/edex/conf/resources contains configuration text files for specific plugins, which allow for user-defined values which are read by AWIPS plugins on EDEX start: acarssounding.properties autobldsrv.properties com.raytheon.edex.plugin.gfe.properties com.raytheon.edex.text.properties com.raytheon.uf.common.registry.ebxml.properties com.raytheon.uf.edex.archive.cron.properties com.raytheon.uf.edex.database.properties com.raytheon.uf.edex.registry.ebxml.properties distribution.properties edex-localization-http.properties edex-requestsrv.properties edex-uengine.properties eventBus.properties ftp.properties goesr.properties grib.properties maintenance.properties proxy.properties purge.properties quartz.properties radar.properties stats.properties textdbsrv.properties warning.properties Look at purge.properties for example: # Master switch to enable and disable purging purge.enabled=true # Interval at which the purge job kicks off purge.cron=0+0/15+*+*+*+? # Interval at which the outgoing files are purged purge.outgoing.cron=0+30+*+*+*+? # Interval at which the logs are purged purge.logs.cron=0+30+0+*+*+? # Interval at which hdf5 orphans are purged purge.orphan.period=24h # Number of days older than the earliest known data to delete. purge.orphan.buffer=7 ... In grib.properties , goesr.properties , and radar.properties you can adjust the number of decoder threads for each plugin. cat radar.properties # Number threads for radar products ingested from the SBN radar-decode.sbn.threads=5","title":"Plugin Configuration"},{"location":"edex/settings/#ingest-modes","text":"By default, EDEX starts three \"modes\": ingest , ingestGrib , and request (each as its own JVM). The file /awips2/edex/conf/modes/modes.xml contains all available mode definitions, including some specific modes for Hydro Server Applications, ebXML Registries, Data Delivery, and more. EDEX services are registered through spring, and by including or excluding specific spring files (usually by datatype plugin name) we can finely customize EDEX startup. In /awips2/edex/conf/modes/modes.xml there are a number of unused plugin decoders excluded because the data are not available outside of the SBN: ... 
.*request.* edex-security.xml ebxml.*\\.xml grib-decode.xml grid-staticdata-process.xml .*(dpa|taf|nctext).* webservices.xml .*datadelivery.* .*bandwidth.* .*sbn-simulator.* hydrodualpol-ingest.xml grid-metadata.xml .*ogc.* obs-ingest-metarshef.xml ffmp-ingest.xml scan-ingest.xml cwat-ingest.xml fog-ingest.xml vil-ingest.xml preciprate-ingest.xml qpf-ingest.xml fssobs-ingest.xml cpgsrv-spring.xml ... In this example, request, ebXML, grib plugins, OGC and other plugins are excluded because they are included in their own mode/JVM. Note : TAF and NCTEXT plugins are disabled here due to performance issues.","title":"Ingest Modes"},{"location":"edex/settings/#jvm-memory","text":"The directory /awips2/edex/etc/ contains files which define the amount of memory used for each of the three EDEX JVMs (ingest, ingestGrib, request): ls -al /awips2/edex/etc/ -rw-r--r-- 1 awips fxalpha 1287 Jul 24 18:41 centralRegistry.sh -rw-r--r-- 1 awips fxalpha 1155 Jul 24 18:42 default.sh -rw-r--r-- 1 awips fxalpha 1956 Jul 24 18:41 ingestGrib.sh -rw-r--r-- 1 awips fxalpha 337 Jul 24 18:36 ingest.sh -rw-r--r-- 1 awips fxalpha 848 Jul 24 18:42 profiler.sh -rw-r--r-- 1 awips fxalpha 1188 Jul 24 18:41 registry.sh -rw-r--r-- 1 awips fxalpha 601 Jul 24 18:36 request.sh Each file contains the Xmx definition for maximum memory: ... export INIT_MEM=512 # in Meg export MAX_MEM=4096 # in Meg ... After editing these files, you must restart : service edex_camel restart .","title":"JVM Memory"},{"location":"install/install-azure/","text":"Azure Portal \uf0c1 create new virtual machine, CentOS 6.7 network rules for ports disk drive mount iptables All of these commands require root or sudo! Create user awips and group fxalpha and create AWIPS directories. groupadd fxalpha useradd -G fxalpha awips or add the existing user to the new group: groupadd fxalpha usermod -a -G fxalpha awips /mnt/resource is a temporary scratch disk on Azure Linux VMs, which makes it an ideal spot for the LDM Raw Data Store (since we don't care about losing the files, which would be purged within one hour anyway). mkdir /awips2 ln -s /mnt/resource /awips2/data_store Mount an Azure SSD to /awips2/edex/data (see dmesg|grep sdc to know if you have one configured): fdisk /dev/sdc mkfs -t ext4 /dev/sdc1 mkdir -p /awips2/edex/data mount /dev/sdc1 /awips2/edex/data and in fstab UUID=0ed45b61-1b93-4d5e-a03c-0adc5ffce62a /awips2/edex/data ext4 defaults,discard 1 2 where UUID is found with the command ls -al /dev/disk/by-uuid Your system looks like this now Filesystem Size Used Avail Use% Mounted on /dev/sda1 30G 2.6G 26G 10% / tmpfs 28G 0 28G 0% /dev/shm /dev/sdb1 111G 60M 105G 1% /mnt/resource /dev/sdc1 1007G 72M 956G 1% /awips2/edex/data and after install Filesystem Size Used Avail Use% Mounted on /dev/sda1 30G 8.2G 20G 30% / tmpfs 28G 0 28G 0% /dev/shm /dev/sdb1 111G 60M 105G 1% /mnt/resource /dev/sdc1 1007G 2.3G 954G 1% /awips2/edex/data /dev/sda1 will contain the /awips2 software installation /dev/sdb1 will contain the LDM raw data store (sym link from /awips2/data_store) /dev/sdc1 will contain the EDEX processed data store (mounted on /awips2/edex/data ) /etc/sysconfig/iptables To serve data from an EDEX server, iptables must allow TCP connections on ports 9581 and 9582 . The following lines added to /etc/sysconfig/iptables , followed by the command service iptables restart , will configure iptables for EDEX.
-A INPUT -p tcp -m tcp --dport 9581 -j ACCEPT -A INPUT -p tcp -m tcp --dport 9582 -j ACCEPT Linux Download \uf0c1 For 64-bit RHEL/CentOS 6 and 7, download and run the script awips_install.sh --edex : wget https://downloads.unidata.ucar.edu/awips2/current/linux/awips_install.sh chmod 755 ./awips_install.sh sudo ./awips_install.sh --edex This will install to /awips2/edex , /awips2/database/data and other directories. CentOS/RHEL 6 and 7 are the only supported operating systems for EDEX (Though you may have luck with Fedora Core 12 to 14 and Scientific Linux). Not supported for EDEX: Debian, Ubuntu, SUSE, Solaris, OS X, Fedora 15+, Windows Be Aware... \uf0c1 selinux should be disabled (read more about selinux at redhat.com) Security Limits - /etc/security/limits.conf Qpid is known to crash on systems without a high security limit for user processes and files. The file /etc/security/limits.conf defines the number of each for the awips user (This is automatically configured by the awips_install.sh --edex script). awips soft nproc 65536 awips soft nofile 65536 LDM config regutil /hostname -s edex-cloud.westus.cloudapp.azure.com regutil /queue/size -s 2500M 127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4 edex-cloud.westus.cloudapp.azure.com ::1 localhost localhost.localdomain localhost6 localhost6.localdomain6 edex-cloud.westus.cloudapp.azure.com What does awips_install.sh --edex do? \uf0c1 Downloads https://downloads.unidata.ucar.edu/awips2/current/linux/awips2.repo to /etc/yum.repos.d/awips2.repo Runs yum clean all Runs yum groupinstall awips2-server","title":"Azure Portal"},{"location":"install/install-azure/#azure-portal","text":"create new virtual machine, CentOS 6.7 network rules for ports disk drive mount iptables All of these commands require root or sudo! Create user awips and group fxalpha and create AWIPS directories. groupadd fxalpha useradd -G fxalpha awips or add the existing user to the new group: groupadd fxalph usermod -a -G fxalpha awips /mnt/resource is a temporary scratch disk on Azure Linux VMs, which makes it an ideal spot for the LDM Raw Data Store (since we don't care about losing the files which would be purged within one hour anyway. mkdir /awips2 ln -s /mnt/resource /awips2/data_store Mount an Azure SSD to /awips2/edex/data see dmesg|grep sdc to know if you have one configured): fdisk /dev/sdc mkfs -t ext4 /dev/sdc1 mkdir -p /awips2/edex/data mount /dev/sdc1 /awips2/edex/data and in fstab UUID=0ed45b61-1b93-4d5e-a03c-0adc5ffce62a /awips2/edex/data ext4 defaults,discard 1 2 where UUID is found with the command ls -al /dev/disk/by-uuid Your system looks like this now Filesystem Size Used Avail Use% Mounted on /dev/sda1 30G 2.6G 26G 10% / tmpfs 28G 0 28G 0% /dev/shm /dev/sdb1 111G 60M 105G 1% /mnt/resource /dev/sdc1 1007G 72M 956G 1% /awips2/edex/data and after install Filesystem Size Used Avail Use% Mounted on /dev/sda1 30G 8.2G 20G 30% / tmpfs 28G 0 28G 0% /dev/shm /dev/sdb1 111G 60M 105G 1% /mnt/resource /dev/sdc1 1007G 2.3G 954G 1% /awips2/edex/data /dev/sda1 will contain the /awips2 software installation /dev/sdb1 will contain the LDM raw data store (sym link from /awips2/data_store) /dev/sdc1 will contain the EDEX processed sata store (mounted on /awips2/edex/data /etc/sysconfig/iptables To serve data from an EDEX server, iptables must allow TCP connections on ports 9581 and 9582 . The following lines added to /etc/sysconfig/iptables , followed by the command service iptables restart , will configure iptables for EDEX. 
-A INPUT -p tcp -m tcp --dport 9581 -j ACCEPT -A INPUT -p tcp -m tcp --dport 9582 -j ACCEPT","title":"Azure Portal"},{"location":"install/install-azure/#linux-download","text":"For 64-bit RHEL/CentOS 6 and 7, download and run the script awips_install.sh --edex : wget https://downloads.unidata.ucar.edu/awips2/current/linux/awips_install.sh chmod 755 ./awips_install.sh sudo ./awips_install.sh --edex This will install to /awips2/edex , /awips2/database/data and other directories. CentOS/RHEL 6 and 7 are the only supported operating systems for EDEX (Though you may have luck with Fedora Core 12 to 14 and Scientific Linux). Not supported for EDEX: Debian, Ubuntu, SUSE, Solaris, OS X, Fedora 15+, Windows","title":"Linux Download"},{"location":"install/install-azure/#be-aware","text":"selinux should be disabled (read more about selinux at redhat.com) Security Limits - /etc/security/limits.conf Qpid is known to crash on systems without a high security limit for user processes and files. The file /etc/security/limits.conf defines the number of each for the awips user (This is automatically configured by the awips_install.sh --edex script). awips soft nproc 65536 awips soft nofile 65536 LDM config regutil /hostname -s edex-cloud.westus.cloudapp.azure.com regutil /queue/size -s 2500M 127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4 edex-cloud.westus.cloudapp.azure.com ::1 localhost localhost.localdomain localhost6 localhost6.localdomain6 edex-cloud.westus.cloudapp.azure.com","title":"Be Aware..."},{"location":"install/install-azure/#what-does-awips_installsh-edex-do","text":"Downloads https://downloads.unidata.ucar.edu/awips2/current/linux/awips2.repo to /etc/yum.repos.d/awips2.repo Runs yum clean all Runs yum groupinstall awips2-server","title":"What does awips_install.sh --edex do?"},{"location":"install/install-cave-beta-v20/","text":"Install CAVE - BETA Version! \uf0c1 CAVE is the C ommon A WIPS V isualization E nvironment that is used for rendering and analyzing data for AWIPS. The installer may require administrator priviledges to install and may require other system changes (environment variables, etc) as well. Latest CAVE Versions \uf0c1 Linux: 20.3.2-0.4 Windows: 20.3.2-0.4 Mac: 20.3.2-0.4 View release notes Version 20.* of CAVE is not compatible with Version 18.* EDEX and vice versa, Version 18.* of CAVE is not compatible with Version 20.* EDEX. Functionality/Reporting \uf0c1 This is a beta release, so we are aware that not all functionality is working as expected. We ask you to please be aware of this and have similar expectations. One noteworthy deficiency we are aware of is the radar menu has not been updated yet to reflect what is in version 18. If you come across issues/bugs/missing functionality, we also encourage you to report it using this short form . General Requirements \uf0c1 Regardless of what Operating System CAVE is running on, these general requirements are recommended in order for CAVE to perform optimally: Local machine Running CAVE via X11 forwarding or ssh tunneling is not supported. Using a VNC connection is the only remote option , and may result in worse performance than running locally. 
OpenGL 2.0 Compatible Devices At least 4GB RAM At least 2GB Disk Space for Caching NVIDIA Graphics Card Latest NVIDIA Driver While other graphics cards may work, NVIDIA Quadro graphics card is recommended for full visualization capability Linux \uf0c1 Latest Version: 20.3.2-0.4 System Requirements \uf0c1 64 bit CentOS/Red Hat 7 Bash shell environment While CentOS8 has reached End of Life as of Dec. 31, 2021, CentOS7 End of Life isn't until June 30, 2024. Download and Installation Instructions \uf0c1 Download the following installer: awips_install-v20.sh In a terminal, go to the download directory Make the installer an executable by running: chmod 755 awips_install-v20.sh Run the installer: sudo ./awips_install-v20.sh --cave This will install the application in /awips2/cave/ and set the local cache to ~/caveData/ Run CAVE \uf0c1 To run CAVE either: Use the terminal and type the command cave Find the application in the Linux Desktop menu: Applications > Internet > AWIPS CAVE Additionally users can choose to run a virtual machine (VM) on Linux. Windows \uf0c1 Latest Version: 20.3.2-0.4 For Windows, Unidata offers two installation options: a Direct Windows Installation , or a Linux Virtual Machine . The direct install is much easier/faster than v18*. The virtual machine option won't render RGB composites of satellite imagery. Method 1: Direct Windows Install \uf0c1 Download and Installation Instructions \uf0c1 Download and install: awips-cave.msi Run CAVE \uf0c1 To run CAVE, either: Double click on the CAVE icon on your desktop Type \"cave\" in the start bar and hit enter Find and run CAVE app in the file browser: C:\\Users\\%USER%\\AppData\\Roaming\\UCAR Unidata\\AWIPS CAVE\\CAVE.bat Method 2: Linux Virtual Machine \uf0c1 Please note, running CAVE in a Virtual Machine has reduced functionality compared to running CAVE directly on hardware (ex: rendering RGB satellite images). System Requirements \uf0c1 VMWare Workstation Player must be installed (free software): For high definition monitors (4k), you will want to enable the high DPI setting for VMWare Workstation Player Create a desktop shortcut for VMWare Workstation Player Right-click the shortcut and select Properties Open the Compatibility Tab Select the \"Change high DPI settings\" button Check the \"High DPI scaling override\" checkbox and choose \"Application\" in the enabled dropdown Download and Installation Instructions \uf0c1 Download the zipped file containing the virtual machine: CentOS7-Unidata-CAVE-20.3.2-0.4 Unzip the folder. Open VMWare Player and go to Player > File... > Open and locate the folder that was created from the downloaded zipped file. Select the file called \"CentOS 7 - Unidata CAVE 20.3.2-0.4.vmx\" . Run this new VM option. If it asks if it's been moved or copied, select \"I Copied It\" .
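For the Linux installation above, a quick post-install check is to confirm that the CAVE packages are in place and that the graphics stack reports OpenGL 2.0 or newer; a minimal sketch (glxinfo is not part of AWIPS and may need to be installed separately):
rpm -qa | grep awips2     # the installed CAVE RPMs and their version should appear in the list
glxinfo | grep -i opengl  # the reported OpenGL version should be 2.0 or higher
ls /awips2/cave           # the install directory created by awips_install-v20.sh --cave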
There will be a user in the Linux machine named \"awips\" and the password is \"awips\" The root password is \"unidataAWIPS\" if ever needed Run CAVE \uf0c1 Once inside the VM, to run CAVE either: Use the desktop icon Use the terminal and type the command cave Find the application in the Linux Desktop menu: Applications > Internet > AWIPS CAVE macOS \uf0c1 Latest Version: 20.3.2-0.4 System Requirements \uf0c1 Nvidia Graphics Card (Some Intel Graphics cards seem to work as well) Download and Installation Instructions \uf0c1 Download and install CAVE You can click and drag the CAVE icon into the Applications Directory to install at the System Application level -- this may require Administrator Privileges You can drag that icon to any other location (Desktop, local user's Applications directory, etc) to install CAVE at that location -- this will not require Administrator Privileges Run CAVE \uf0c1 To run CAVE either: Use the System Menu Go > Applications > CAVE Type \u2318 + Spacebar and then type \"cave\", the application should appear and you can hit enter to run it The first time CAVE is opened, it will ask you if you are sure you want to run it, because it was downloaded from the internet and not the Apple Store. This is normal, and hit Open. Your message my differ slightly but should look like the image below: EDEX Connection \uf0c1 Unidata and Jetstream2 have partnered to offer a EDEX data server in the cloud, open to the public. Select the server in the Connectivity Preferences dialog, or enter edex-beta.unidata.ucar.edu . Local Cache \uf0c1 After connecting to an EDEX server, you will have a local directory named caveData which contains files synced from EDEX as well as a client-side cache for data and map resources. You can reset CAVE by removing the caveData directory and reconnecting to an EDEX server. Your local files have been removed, but if you are re-connecting to an EDEX server you have used before, the remote files will sync again to your local ~/caveData (bundles, colormaps, etc.). Linux: /home//caveData/ macOS: /Users//Library/caveData/ Windows: C:\\Users\\\\caveData\\ Uninstalling CAVE (Linux) \uf0c1 These are instructions to manually uninstall CAVE. However, the awips_install-v20.sh script will do these steps for you if you are installing a newer version of CAVE. 1. Make sure you have exited out of any CAVE sessions Check to make sure your /etc/yum.repos.d/awips2.repo file has enabled=1 . 2. Remove currently installed CAVE sudo yum clean all sudo yum groupremove \"AWIPS CAVE\" If you are having trouble removing a group, see the troubleshooting section. 3. Check to make sure all awips rpms have been removed rpm -qa | grep awips2 If you still have rpms installed, remove them sudo yum remove awips2-* 4. Remove the cave directory in /awips2 and caveData from your home directory rm -rf /awips2/cave rm -rf ~/caveData","title":"Install CAVE - BETA Version!"},{"location":"install/install-cave-beta-v20/#install-cave-beta-version","text":"CAVE is the C ommon A WIPS V isualization E nvironment that is used for rendering and analyzing data for AWIPS. 
The installer may require administrator priviledges to install and may require other system changes (environment variables, etc) as well.","title":"Install CAVE - BETA Version!"},{"location":"install/install-cave-beta-v20/#latest-cave-versions","text":"Linux: 20.3.2-0.4 Windows: 20.3.2-0.4 Mac: 20.3.2-0.4 View release notes Version 20.* of CAVE is not compatible with Version 18.* EDEX and vice versa, Version 18.* of CAVE is not compatible with Version 20.* EDEX.","title":"Latest CAVE Versions"},{"location":"install/install-cave-beta-v20/#functionalityreporting","text":"This is a beta release, so we are aware that not all functionality is working as expected. We ask you to please be aware of this and have similar expectations. One noteworthy deficiency we are aware of is the radar menu has not been updated yet to reflect what is in version 18. If you come across issues/bugs/missing functionality, we also encourage you to report it using this short form .","title":"Functionality/Reporting"},{"location":"install/install-cave-beta-v20/#general-requirements","text":"Regardless of what Operating System CAVE is running on, these general requirements are recommended in order for CAVE to perform optimally: Local machine Running CAVE via X11 forwarding or ssh tunneling is not supported. Using a VNC connection is the only remote option , and may result in worse performance than running locally. OpenGL 2.0 Compatible Devices At least 4GB RAM At least 2GB Disk Space for Caching NVIDIA Graphics Card Latest NVIDIA Driver While other graphics cards may work, NVIDIA Quadro graphics card is recommended for full visualization capability","title":"General Requirements"},{"location":"install/install-cave-beta-v20/#linux","text":"Latest Version: 20.3.2-0.4","title":"Linux "},{"location":"install/install-cave-beta-v20/#system-requirements","text":"64 bit CentOS/Red Hat 7 Bash shell environment While CentOS8 has reach End of Life as of Dec. 31, 2021, CentOS7 End of Life isn't until June 30, 2024.","title":"System Requirements"},{"location":"install/install-cave-beta-v20/#download-and-installation-instructions","text":"Download the following installer: awips_install-v20.sh In a terminal, go to the download directory Make the installer an executable by running: chmod 755 awips_install-v20.sh Run the installer: sudo ./awips_install-v20.sh --cave This will install the application in /awips2/cave/ and set the local cache to ~/caveData/","title":"Download and Installation Instructions"},{"location":"install/install-cave-beta-v20/#run-cave","text":"To run CAVE either: Use the terminal and type the command cave Find the application in the Linux Desktop menu: Applications > Internet > AWIPS CAVE Additionally users can choose to run a virtual machine (VM) on Linux.","title":"Run CAVE"},{"location":"install/install-cave-beta-v20/#windows","text":"Latest Version: 20.3.2-0.4 For Windows, Unidata offers two installation options: a Direct Windows Installation , or a Linux Virtual Machine . The direct install is much easier/faster than v18*. 
The virtual machine option won't render RGB composites of satellite imagery.","title":"Windows "},{"location":"install/install-cave-beta-v20/#method-1-direct-windows-install","text":"","title":"Method 1: Direct Windows Install"},{"location":"install/install-cave-beta-v20/#download-and-installation-instructions_1","text":"Download and install: awips-cave.msi","title":"Download and Installation Instructions"},{"location":"install/install-cave-beta-v20/#run-cave_1","text":"To run CAVE, either: Double click on the CAVE icon on your desktop Type \"cave\" in the start bar and hit enter Find and run CAVE app in the file browser: C:\\Users\\%USER%\\AppData\\Roaming\\UCAR Unidata\\AWIPS CAVE\\CAVE.bat","title":"Run CAVE"},{"location":"install/install-cave-beta-v20/#method-2-linux-virtual-machine","text":"Please note, running CAVE in a Virtual Machine does have reduced functionality than running CAVE directly on hardware (ex: rendering RGB satellite images).","title":"Method 2: Linux Virtual Machine"},{"location":"install/install-cave-beta-v20/#system-requirements_1","text":"VMWare Workstation Player must be installed (free software): For high definition monitors (4k), you will want to enable the high DPI setting for VMWare Workstation Player Create a desktop shortcut for VMWare Workstation Player Right-click the shortcut and select Properties Open the Compatability Tab Select the \"Change high DPI settings\" button Check the \"High DPI scaling ovveride\" checkbox and choose \"Application\" in the enabled dropdown","title":"System Requirements"},{"location":"install/install-cave-beta-v20/#download-and-installation-instructions_2","text":"Download the zipped file containing the virtual machine: CentOS7-Unidata-CAVE-20.3.2-0.4 Unzip the folder. Open VMWare Player and go to Player > File... > Open and locate the folder that was created from the downloaded zipped file. Select the file called \"CentOS 7 - Unidata CAVE 20.3.2-0.4.vmx\" . Run this new VM option. If it asks if it's been moved or copied, select \"I Copied It\" . 
There will be a user in the Linux machine named \"awips\" and the password is \"awips\" The root password is \"unidataAWIPS\" if ever needed","title":"Download and Installation Instructions"},{"location":"install/install-cave-beta-v20/#run-cave_2","text":"Once inside the VM, to run CAVE either: Use the desktop icon Use the terminal and type the command cave Find the application in the Linux Desktop menu: Applications > Internet > AWIPS CAVE","title":"Run CAVE"},{"location":"install/install-cave-beta-v20/#macos","text":"Latest Version: 20.3.2-0.4","title":"macOS "},{"location":"install/install-cave-beta-v20/#system-requirements_2","text":"Nvidia Graphics Card (Some Intel Graphics cards seem to work as well)","title":"System Requirements"},{"location":"install/install-cave-beta-v20/#download-and-installation-instructions_3","text":"Download and install CAVE You can click and drag the CAVE icon into the Applications Directory to install at the System Application level -- this may require Administrator Privileges You can drag that icon to any other location (Desktop, local user's Applications directory, etc) to install CAVE at that location -- this will not require Administrator Privileges","title":"Download and Installation Instructions"},{"location":"install/install-cave-beta-v20/#run-cave_3","text":"To run CAVE either: Use the System Menu Go > Applications > CAVE Type \u2318 + Spacebar and then type \"cave\", the application should appear and you can hit enter to run it The first time CAVE is opened, it will ask you if you are sure you want to run it, because it was downloaded from the internet and not the Apple Store. This is normal, and hit Open. Your message my differ slightly but should look like the image below:","title":"Run CAVE"},{"location":"install/install-cave-beta-v20/#edex-connection","text":"Unidata and Jetstream2 have partnered to offer a EDEX data server in the cloud, open to the public. Select the server in the Connectivity Preferences dialog, or enter edex-beta.unidata.ucar.edu .","title":"EDEX Connection"},{"location":"install/install-cave-beta-v20/#local-cache","text":"After connecting to an EDEX server, you will have a local directory named caveData which contains files synced from EDEX as well as a client-side cache for data and map resources. You can reset CAVE by removing the caveData directory and reconnecting to an EDEX server. Your local files have been removed, but if you are re-connecting to an EDEX server you have used before, the remote files will sync again to your local ~/caveData (bundles, colormaps, etc.). Linux: /home//caveData/ macOS: /Users//Library/caveData/ Windows: C:\\Users\\\\caveData\\","title":"Local Cache"},{"location":"install/install-cave-beta-v20/#uninstalling-cave-linux","text":"These are instructions to manually uninstall CAVE. However, the awips_install-v20.sh script will do these steps for you if you are installing a newer version of CAVE. 1. Make sure you have exited out of any CAVE sessions Check to make sure your /etc/yum.repos.d/awips2.repo file has enabled=1 . 2. Remove currently installed CAVE sudo yum clean all sudo yum groupremove \"AWIPS CAVE\" If you are having trouble removing a group, see the troubleshooting section. 3. Check to make sure all awips rpms have been removed rpm -qa | grep awips2 If you still have rpms installed, remove them sudo yum remove awips2-* 4. 
Remove the cave directory in /awips2 and caveData from your home directory rm -rf /awips2/cave rm -rf ~/caveData","title":"Uninstalling CAVE (Linux)"},{"location":"install/install-cave/","text":"Install CAVE \uf0c1 CAVE is the C ommon A WIPS V isualization E nvironment that is used for rendering and analyzing data for AWIPS. Unidata supports CAVE to work on three platforms: Centos (Redhat) Linux , Windows , and macOS . The installer may require administrator privileges to install and may require other system changes (environment variables, etc) as well. Latest CAVE Versions \uf0c1 Linux: 18.2.1-6 Windows: 18.2.1-6 Mac: 18.2.1-6 BETA Version: 20.3.2-0.4 View release notes General Requirements \uf0c1 Regardless of what Operating System CAVE is running on, these general requirements are recommended in order for CAVE to perform optimally: Local machine Running CAVE via X11 forwarding or ssh tunneling is not supported. Using a VNC connection is the only remote option , and may result in worse performance than running locally. Java 1.8 OpenGL 2.0 Compatible Devices At least 4GB RAM At least 2GB Disk Space for Caching NVIDIA Graphics Card Latest NVIDIA Driver While other graphics cards may work, NVIDIA Quadro graphics card is recommended for full visualization capability Linux \uf0c1 Latest Version: 18.2.1-6 System Requirements \uf0c1 64 bit CentOS/Red Hat 7 Bash shell environment While CentOS8 has reach End of Life as of Dec. 31, 2021, CentOS7 End of Life isn't until June 30, 2024. Download and Installation Instructions \uf0c1 Download the following installer: awips_install.sh In a terminal, go to the download directory Make the installer an executable by running: chmod 755 awips_install.sh Run the installer: sudo ./awips_install.sh --cave This will install the application in /awips2/cave/ and set the local cache to ~/caveData/ Run CAVE \uf0c1 To run CAVE either: Use the terminal and type the command cave Find the application in the Linux Desktop menu: Applications > Internet > AWIPS CAVE Additionally users can choose to run a virtual machine (VM) on Linux. Windows \uf0c1 Latest Version: 18.2.1-6 For Windows, Unidata offers two installation options: a Direct Windows Installation , or a Linux Virtual Machine . As of 4/24/2023, the direct install method has been completely simplified! No admin privileges or extra software needed. At the moment, the VM option may not render all products in CAVE (ex. RGB composites of satellite imagery) Method 1: Direct Windows Install \uf0c1 This method has been simplified to include python and java packaged with CAVE so no other software installation is necessary. CAVE no longer needs changes to any environment variables in order to run. Download and Installation Instructions \uf0c1 Download and install: awips-cave.msi This will install CAVE to the user level in %HOMEPATH%\\AppDir\\Roaming\\UCAR Unidata\\AWIPS CAVE . Run CAVE \uf0c1 To run CAVE, either: Double click on the CAVE shortcut on the desktop Type \"cave\" in the start bar and hit enter Method 2: Linux Virtual Machine \uf0c1 This is an additional installation method, however at this time, some CAVE functionality may be missing (ex: rendering RGB satellite images). 
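The Linux download and installation steps above can be run as one short sequence; a minimal sketch, assuming awips_install.sh is fetched from the same downloads URL used for the EDEX installer elsewhere in this guide:
wget https://downloads.unidata.ucar.edu/awips2/current/linux/awips_install.sh
chmod 755 ./awips_install.sh
sudo ./awips_install.sh --cave   # installs to /awips2/cave/ and sets the local cache to ~/caveData/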
System Requirements \uf0c1 VMWare Workstation Player must be installed (free software): For high definition monitors (4k), you will want to enable the high DPI setting for VMWare Workstation Player Create a desktop shortcut for VMWare Workstation Player Right-click the shortcut and select Properties Open the Compatability Tab Select the \"Change high DPI settings\" button Check the \"High DPI scaling ovveride\" checkbox and choose \"Application\" in the enabled dropdown Download and Installation Instructions \uf0c1 Download the zipped file containing the virtual machine: unidata_cave.zip Unzip the folder by right-clicking and selecting \"Extract All\". All files will be extracted into a new folder. Open VMWare Player and go to Player > File... > Open and locate the folder that was created from the downloaded zipped file. Select the file called \"CentOS 7 - Unidata CAVE 18-2-1-6.vmx\" . Run this new VM option. If it asks if it's been moved or copied, select \"I Copied It\" . There will be a user in the Linux machine named \"awips\" and the password is \"awips\" The root password is \"unidataAWIPS\" if ever needed Run CAVE \uf0c1 Once inside the VM, to run CAVE either: Use the desktop icon Use the terminal and type the command cave Find the application in the Linux Desktop menu: Applications > Internet > AWIPS CAVE macOS \uf0c1 Latest Version: 18.2.1-6 System Requirements \uf0c1 MacOS Monterey version 12.3 and above no longer supports Python2. This will cause several visualization aspects to fail in CAVE. If you update to MacOS 12.3 CAVE will not be fully functional. Please download and install our beta v20 of CAVE for newer MacOS versions. Will need admin privileges to install awips-python.pkg NVIDIA Graphics card is recommended, however some Intel Graphics cards will support a majority of the functionality Most AMD graphics cards are not supported Download and Installation Instructions \uf0c1 Download and install: awips-python.pkg (This step requires administrative privileges) Once downloaded, double click and the installer will launch with the following screens, please keep the default selections : Between these steps it will prompt for an administrator's password The awips-python.pkg is not necessarily required, and CAVE will still run without it, but any derived data such as barbs, arrows, and various grid products will not render without having jep installed (it is assumed to be in /Library/Python/2.7/site-packages/jep/) Download and install: awips-cave.dmg Either use the default location, which is in the system-wide \"Applications\" directory, by clicking and dragging the CAVE icon into the Applications folder, in the window that pops up when installing: Or open a new Finder window to your [user home]/Applications/ directory (if it doesn't exist, simply create a new folder with the name \"Applications\"), and drag the CAVE icon into that folder: This will install CAVE as an application and set the local cache to ~/Library/caveData Run CAVE \uf0c1 To run CAVE either: Use the System Menu Go > Applications > CAVE Type \u2318 + Spacebar and then type \"cave\", the application should appear and you can hit enter to run it The first time CAVE is opened, it will ask you if you are sure you want to run it, because it was downloaded from the internet and not the Apple Store. This is normal, and hit Open. 
Your message my differ slightly but should look like the image below: MacOS Monterey Warning \uf0c1 If you are running MacOS Monterey, you may see the following message when starting CAVE: This message can be ignored, and will hopefully go away when we release version 20+ of AWIPS. EDEX Connection \uf0c1 Unidata and XSEDE Jetstream have partnered to offer a EDEX data server in the cloud, open to the Unidata university community. Select the server in the Connectivity Preferences dialog, or enter edex-cloud.unidata.ucar.edu (without http:// before, or :9581/services after). Local Cache \uf0c1 After connecting to an EDEX server, you will have a local directory named caveData which contains files synced from EDEX as well as a client-side cache for data and map resources. You can reset CAVE by removing the caveData directory and reconnecting to an EDEX server. Your local files have been removed, but if you are re-connecting to an EDEX server you have used before, the remote files will sync again to your local ~/caveData (bundles, colormaps, etc.). Linux: /home//caveData/ macOS: /Users//Library/caveData/ Windows: C:\\Users\\\\caveData\\ Uninstalling CAVE (Linux) \uf0c1 These are instructions to manually uninstall CAVE. However, the awips_install.sh script will do these steps for you if you are installing a newer version of CAVE. 1. Make sure you have exited out of any CAVE sessions Check to make sure your /etc/yum.repos.d/awips2.repo file has enabled=1 . 2. Remove currently installed CAVE sudo yum clean all sudo yum groupremove \"AWIPS CAVE\" If you are having trouble removing a group, see the troubleshooting section. 3. Check to make sure all awips rpms have been removed rpm -qa | grep awips2 If you still have rpms installed, remove them sudo yum remove awips2-* 4. Remove the cave directory in /awips2 and caveData from your home directory rm -rf /awips2/cave rm -rf ~/caveData","title":"Install CAVE"},{"location":"install/install-cave/#install-cave","text":"CAVE is the C ommon A WIPS V isualization E nvironment that is used for rendering and analyzing data for AWIPS. Unidata supports CAVE to work on three platforms: Centos (Redhat) Linux , Windows , and macOS . The installer may require administrator privileges to install and may require other system changes (environment variables, etc) as well.","title":"Install CAVE"},{"location":"install/install-cave/#latest-cave-versions","text":"Linux: 18.2.1-6 Windows: 18.2.1-6 Mac: 18.2.1-6 BETA Version: 20.3.2-0.4 View release notes","title":"Latest CAVE Versions"},{"location":"install/install-cave/#general-requirements","text":"Regardless of what Operating System CAVE is running on, these general requirements are recommended in order for CAVE to perform optimally: Local machine Running CAVE via X11 forwarding or ssh tunneling is not supported. Using a VNC connection is the only remote option , and may result in worse performance than running locally. Java 1.8 OpenGL 2.0 Compatible Devices At least 4GB RAM At least 2GB Disk Space for Caching NVIDIA Graphics Card Latest NVIDIA Driver While other graphics cards may work, NVIDIA Quadro graphics card is recommended for full visualization capability","title":"General Requirements"},{"location":"install/install-cave/#linux","text":"Latest Version: 18.2.1-6","title":"Linux "},{"location":"install/install-cave/#system-requirements","text":"64 bit CentOS/Red Hat 7 Bash shell environment While CentOS8 has reach End of Life as of Dec. 
31, 2021, CentOS7 End of Life isn't until June 30, 2024.","title":"System Requirements"},{"location":"install/install-cave/#download-and-installation-instructions","text":"Download the following installer: awips_install.sh In a terminal, go to the download directory Make the installer an executable by running: chmod 755 awips_install.sh Run the installer: sudo ./awips_install.sh --cave This will install the application in /awips2/cave/ and set the local cache to ~/caveData/","title":"Download and Installation Instructions"},{"location":"install/install-cave/#run-cave","text":"To run CAVE either: Use the terminal and type the command cave Find the application in the Linux Desktop menu: Applications > Internet > AWIPS CAVE Additionally users can choose to run a virtual machine (VM) on Linux.","title":"Run CAVE"},{"location":"install/install-cave/#windows","text":"Latest Version: 18.2.1-6 For Windows, Unidata offers two installation options: a Direct Windows Installation , or a Linux Virtual Machine . As of 4/24/2023, the direct install method has been completely simplified! No admin privileges or extra software needed. At the moment, the VM option may not render all products in CAVE (ex. RGB composites of satellite imagery)","title":"Windows "},{"location":"install/install-cave/#method-1-direct-windows-install","text":"This method has been simplified to include python and java packaged with CAVE so no other software installation is necessary. CAVE no longer needs changes to any environment variables in order to run.","title":"Method 1: Direct Windows Install"},{"location":"install/install-cave/#download-and-installation-instructions_1","text":"Download and install: awips-cave.msi This will install CAVE to the user level in %HOMEPATH%\\AppDir\\Roaming\\UCAR Unidata\\AWIPS CAVE .","title":"Download and Installation Instructions"},{"location":"install/install-cave/#run-cave_1","text":"To run CAVE, either: Double click on the CAVE shortcut on the desktop Type \"cave\" in the start bar and hit enter","title":"Run CAVE"},{"location":"install/install-cave/#method-2-linux-virtual-machine","text":"This is an additional installation method, however at this time, some CAVE functionality may be missing (ex: rendering RGB satellite images).","title":"Method 2: Linux Virtual Machine"},{"location":"install/install-cave/#system-requirements_1","text":"VMWare Workstation Player must be installed (free software): For high definition monitors (4k), you will want to enable the high DPI setting for VMWare Workstation Player Create a desktop shortcut for VMWare Workstation Player Right-click the shortcut and select Properties Open the Compatability Tab Select the \"Change high DPI settings\" button Check the \"High DPI scaling ovveride\" checkbox and choose \"Application\" in the enabled dropdown","title":"System Requirements"},{"location":"install/install-cave/#download-and-installation-instructions_2","text":"Download the zipped file containing the virtual machine: unidata_cave.zip Unzip the folder by right-clicking and selecting \"Extract All\". All files will be extracted into a new folder. Open VMWare Player and go to Player > File... > Open and locate the folder that was created from the downloaded zipped file. Select the file called \"CentOS 7 - Unidata CAVE 18-2-1-6.vmx\" . Run this new VM option. If it asks if it's been moved or copied, select \"I Copied It\" . 
There will be a user in the Linux machine named \"awips\" and the password is \"awips\" The root password is \"unidataAWIPS\" if ever needed","title":"Download and Installation Instructions"},{"location":"install/install-cave/#run-cave_2","text":"Once inside the VM, to run CAVE either: Use the desktop icon Use the terminal and type the command cave Find the application in the Linux Desktop menu: Applications > Internet > AWIPS CAVE","title":"Run CAVE"},{"location":"install/install-cave/#macos","text":"Latest Version: 18.2.1-6","title":"macOS "},{"location":"install/install-cave/#system-requirements_2","text":"MacOS Monterey version 12.3 and above no longer supports Python2. This will cause several visualization aspects to fail in CAVE. If you update to MacOS 12.3 CAVE will not be fully functional. Please download and install our beta v20 of CAVE for newer MacOS versions. Will need admin privileges to install awips-python.pkg NVIDIA Graphics card is recommended, however some Intel Graphics cards will support a majority of the functionality Most AMD graphics cards are not supported","title":"System Requirements"},{"location":"install/install-cave/#download-and-installation-instructions_3","text":"Download and install: awips-python.pkg (This step requires administrative privileges) Once downloaded, double click and the installer will launch with the following screens, please keep the default selections : Between these steps it will prompt for an administrator's password The awips-python.pkg is not necessarily required, and CAVE will still run without it, but any derived data such as barbs, arrows, and various grid products will not render without having jep installed (it is assumed to be in /Library/Python/2.7/site-packages/jep/) Download and install: awips-cave.dmg Either use the default location, which is in the system-wide \"Applications\" directory, by clicking and dragging the CAVE icon into the Applications folder, in the window that pops up when installing: Or open a new Finder window to your [user home]/Applications/ directory (if it doesn't exist, simply create a new folder with the name \"Applications\"), and drag the CAVE icon into that folder: This will install CAVE as an application and set the local cache to ~/Library/caveData","title":"Download and Installation Instructions"},{"location":"install/install-cave/#run-cave_3","text":"To run CAVE either: Use the System Menu Go > Applications > CAVE Type \u2318 + Spacebar and then type \"cave\", the application should appear and you can hit enter to run it The first time CAVE is opened, it will ask you if you are sure you want to run it, because it was downloaded from the internet and not the Apple Store. This is normal, and hit Open. Your message my differ slightly but should look like the image below:","title":"Run CAVE"},{"location":"install/install-cave/#macos-monterey-warning","text":"If you are running MacOS Monterey, you may see the following message when starting CAVE: This message can be ignored, and will hopefully go away when we release version 20+ of AWIPS.","title":"MacOS Monterey Warning"},{"location":"install/install-cave/#edex-connection","text":"Unidata and XSEDE Jetstream have partnered to offer a EDEX data server in the cloud, open to the Unidata university community. 
Select the server in the Connectivity Preferences dialog, or enter edex-cloud.unidata.ucar.edu (without http:// before, or :9581/services after).","title":"EDEX Connection"},{"location":"install/install-cave/#local-cache","text":"After connecting to an EDEX server, you will have a local directory named caveData which contains files synced from EDEX as well as a client-side cache for data and map resources. You can reset CAVE by removing the caveData directory and reconnecting to an EDEX server. Your local files have been removed, but if you are re-connecting to an EDEX server you have used before, the remote files will sync again to your local ~/caveData (bundles, colormaps, etc.). Linux: /home//caveData/ macOS: /Users//Library/caveData/ Windows: C:\\Users\\\\caveData\\","title":"Local Cache"},{"location":"install/install-cave/#uninstalling-cave-linux","text":"These are instructions to manually uninstall CAVE. However, the awips_install.sh script will do these steps for you if you are installing a newer version of CAVE. 1. Make sure you have exited out of any CAVE sessions Check to make sure your /etc/yum.repos.d/awips2.repo file has enabled=1 . 2. Remove currently installed CAVE sudo yum clean all sudo yum groupremove \"AWIPS CAVE\" If you are having trouble removing a group, see the troubleshooting section. 3. Check to make sure all awips rpms have been removed rpm -qa | grep awips2 If you still have rpms installed, remove them sudo yum remove awips2-* 4. Remove the cave directory in /awips2 and caveData from your home directory rm -rf /awips2/cave rm -rf ~/caveData","title":"Uninstalling CAVE (Linux)"},{"location":"install/install-distributed/","text":"An example of a two-server configuration (LDM and EDEX seperately) using Microsoft Azure CentOS 7.2 virtual machines (Unidata EDEX is supported on CentOS/RHEL 7 since 16.2.2). cifs setup \uf0c1 Following the guide https://docs.microsoft.com/en-us/azure/storage/storage-how-to-use-files-linux , our two Azure VMs will share a single file storage directory mounted via Samba cifs . LDM will write to the file share, and EDEX will read from it to ingest and decode IDD products. In the Azure portal : Create a new Standard storage account (e.g. edex7203 ) Create a new File service within the storange account (e.g. datastore ), 100GB minimum. The file service will be located at //edex7203.file.core.windows.net/datastore Select the Configuration tab and confirm Standard Performance and Locally-redundant storage (LRS) for Replication (these should be defaults). Select the Access keys tab and copy one of the keys for /etc/fstab /etc/fstab should look like this ( for both machines ): UUID=0177d0ac-2605-4bfb-9873-5bdefea12fe2 / xfs defaults 0 0 //edex7203.file.core.windows.net/datastore /awips2/data_store cifs vers=3.0,password=YOUR_KEY_HERE,user=edex7203,dir_mode=0777,file_mode=0777 Note the YOUR_KEY_HERE placeholder above, that's where your key will go. Now run mount -a and confirm /awips2/data_store is mounted with the command df -h Filesystem Size Used Avail Use% Mounted on /dev/sda1 30G 7.4G 23G 25% / /dev/sdb1 14G 41M 13G 1% /mnt/resource //edex7203.file.core.windows.net/datastore 100G 1M 100G 1% /awips2/data_store EDEX server (10.0.0.1) \uf0c1 In the Azure portal : Create a new virtual machine with an awips user account CentOS 7.2 DS5_V2 Standard (16 cores, 56 GB) Ensure that this VM is on the same Virtual Network as the LDM machine (both on the 10.0.0.* subnet). 
Select the new vm, then select Disks , and modify the attached OS Disk to be 512GB or greater (vm must be stopped for this). Start the VM, log in as root, and follow the steps in the guide Step by Step: how to resize a Linux VM OS disk in Azure (with one dfference in step 5 below) fdisk /dev/sda type \" u \" to change the units to sectors. type \" p \" to list current partition details. type \" d \" to delete the current partition. type \" n \" to create a new partition. Select defaults (p for primary partition, 1 for first part). type \" w \" to write the partition. Reboot the machine and log in again (as root). Run xfs_growfs /dev/sda1 and check that the OS disk mounts with the new partition size with df -h We use xfs_growfs here for XFS here ( read more... ) instead of resize2fs for EXT2/EXT3/EXT4. yum install iptables-services vi /etc/sysconfig/iptables *filter :INPUT ACCEPT [0:0] :FORWARD ACCEPT [0:0] :OUTPUT ACCEPT [0:0] -A INPUT -m state --state RELATED,ESTABLISHED -j ACCEPT -A INPUT -p icmp -j ACCEPT -A INPUT -i lo -j ACCEPT -A INPUT -p tcp -m state --state NEW -m tcp --dport 22 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 5672 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9581 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9582 -j ACCEPT # -A INPUT -m state --state NEW -m tcp -p tcp --dport 9588 -j ACCEPT # registry/dd -A INPUT -j REJECT --reject-with icmp-host-prohibited -A FORWARD -j REJECT --reject-with icmp-host-prohibited COMMIT service iptables restart vi /etc/sysconfig/selinux ( read more about selinux at redhat.com ) # This file controls the state of SELinux on the system. # SELINUX= can take one of these three values: # enforcing - SELinux security policy is enforced. # permissive - SELinux prints warnings instead of enforcing. # disabled - No SELinux policy is loaded. SELINUX=disabled # SELINUXTYPE= can take one of these two values: # targeted - Targeted processes are protected, # mls - Multi Level Security protection. SELINUXTYPE=targeted reboot for the selinux changes to take effect. Create user and group awips:fxalpha groupadd fxalpha && useradd -G fxalpha awips or if the awips account already exists: groupadd fxalpha && usermod -G fxalpha awips Finally, install the EDEX server wget https://downloads.unidata.ucar.edu/awips2/current/linux/awips_install.sh chmod 755 ./awips_install.sh sudo ./awips_install.sh --edex LDM server (10.0.0.2) \uf0c1 A small LDM server to write data files to the file share /awips2/data_store and send messages to the EDEX machine (10.0.0.1) via edexBridge . In the Azure portal : Create a new virtual machine with an awips user account CentOS 7.2 DS2_V2 Standard (2 cores, 7 GB) Start the VM, log in and sudo su - to root, then run wget -O /etc/yum.repos.d/awips2.repo https://downloads.unidata.ucar.edu/awips2/current/linux/awips2.repo yum clean all yum groupinstall awips2-ldm-server vi /awips2/ldm/etc/ldmd.conf to define the edexBridge server nane EXEC \"edexBridge -s 10.0.0.1\" service edex_ldm start Note: You do not need to configure iptables on an LDM-only machine (only for EDEX).","title":"Install distributed"},{"location":"install/install-distributed/#cifs-setup","text":"Following the guide https://docs.microsoft.com/en-us/azure/storage/storage-how-to-use-files-linux , our two Azure VMs will share a single file storage directory mounted via Samba cifs . LDM will write to the file share, and EDEX will read from it to ingest and decode IDD products. 
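Once the share is mounted on both VMs (the portal and fstab steps follow below), it is worth confirming that a file written by one machine shows up on the other; a quick check using a throwaway file (the filename is only an example):
touch /awips2/data_store/cifs-write-test    # on the LDM VM, which writes raw data to this share
ls -l /awips2/data_store/cifs-write-test    # on the EDEX VM, the same file should be visible
rm /awips2/data_store/cifs-write-test       # clean up from either machine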
In the Azure portal : Create a new Standard storage account (e.g. edex7203 ) Create a new File service within the storange account (e.g. datastore ), 100GB minimum. The file service will be located at //edex7203.file.core.windows.net/datastore Select the Configuration tab and confirm Standard Performance and Locally-redundant storage (LRS) for Replication (these should be defaults). Select the Access keys tab and copy one of the keys for /etc/fstab /etc/fstab should look like this ( for both machines ): UUID=0177d0ac-2605-4bfb-9873-5bdefea12fe2 / xfs defaults 0 0 //edex7203.file.core.windows.net/datastore /awips2/data_store cifs vers=3.0,password=YOUR_KEY_HERE,user=edex7203,dir_mode=0777,file_mode=0777 Note the YOUR_KEY_HERE placeholder above, that's where your key will go. Now run mount -a and confirm /awips2/data_store is mounted with the command df -h Filesystem Size Used Avail Use% Mounted on /dev/sda1 30G 7.4G 23G 25% / /dev/sdb1 14G 41M 13G 1% /mnt/resource //edex7203.file.core.windows.net/datastore 100G 1M 100G 1% /awips2/data_store","title":"cifs setup"},{"location":"install/install-distributed/#edex-server-10001","text":"In the Azure portal : Create a new virtual machine with an awips user account CentOS 7.2 DS5_V2 Standard (16 cores, 56 GB) Ensure that this VM is on the same Virtual Network as the LDM machine (both on the 10.0.0.* subnet). Select the new vm, then select Disks , and modify the attached OS Disk to be 512GB or greater (vm must be stopped for this). Start the VM, log in as root, and follow the steps in the guide Step by Step: how to resize a Linux VM OS disk in Azure (with one dfference in step 5 below) fdisk /dev/sda type \" u \" to change the units to sectors. type \" p \" to list current partition details. type \" d \" to delete the current partition. type \" n \" to create a new partition. Select defaults (p for primary partition, 1 for first part). type \" w \" to write the partition. Reboot the machine and log in again (as root). Run xfs_growfs /dev/sda1 and check that the OS disk mounts with the new partition size with df -h We use xfs_growfs here for XFS here ( read more... ) instead of resize2fs for EXT2/EXT3/EXT4. yum install iptables-services vi /etc/sysconfig/iptables *filter :INPUT ACCEPT [0:0] :FORWARD ACCEPT [0:0] :OUTPUT ACCEPT [0:0] -A INPUT -m state --state RELATED,ESTABLISHED -j ACCEPT -A INPUT -p icmp -j ACCEPT -A INPUT -i lo -j ACCEPT -A INPUT -p tcp -m state --state NEW -m tcp --dport 22 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 5672 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9581 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9582 -j ACCEPT # -A INPUT -m state --state NEW -m tcp -p tcp --dport 9588 -j ACCEPT # registry/dd -A INPUT -j REJECT --reject-with icmp-host-prohibited -A FORWARD -j REJECT --reject-with icmp-host-prohibited COMMIT service iptables restart vi /etc/sysconfig/selinux ( read more about selinux at redhat.com ) # This file controls the state of SELinux on the system. # SELINUX= can take one of these three values: # enforcing - SELinux security policy is enforced. # permissive - SELinux prints warnings instead of enforcing. # disabled - No SELinux policy is loaded. SELINUX=disabled # SELINUXTYPE= can take one of these two values: # targeted - Targeted processes are protected, # mls - Multi Level Security protection. SELINUXTYPE=targeted reboot for the selinux changes to take effect. 
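After that reboot, the SELinux state can be verified before moving on; a quick check (these utilities are typically present on CentOS/RHEL 7):
getenforce    # should print Disabled
sestatus      # fuller report; the SELinux status line should read disabled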
Create user and group awips:fxalpha groupadd fxalpha && useradd -G fxalpha awips or if the awips account already exists: groupadd fxalpha && usermod -G fxalpha awips Finally, install the EDEX server wget https://downloads.unidata.ucar.edu/awips2/current/linux/awips_install.sh chmod 755 ./awips_install.sh sudo ./awips_install.sh --edex","title":"EDEX server (10.0.0.1)"},{"location":"install/install-distributed/#ldm-server-10002","text":"A small LDM server to write data files to the file share /awips2/data_store and send messages to the EDEX machine (10.0.0.1) via edexBridge . In the Azure portal : Create a new virtual machine with an awips user account CentOS 7.2 DS2_V2 Standard (2 cores, 7 GB) Start the VM, log in and sudo su - to root, then run wget -O /etc/yum.repos.d/awips2.repo https://downloads.unidata.ucar.edu/awips2/current/linux/awips2.repo yum clean all yum groupinstall awips2-ldm-server vi /awips2/ldm/etc/ldmd.conf to define the edexBridge server nane EXEC \"edexBridge -s 10.0.0.1\" service edex_ldm start Note: You do not need to configure iptables on an LDM-only machine (only for EDEX).","title":"LDM server (10.0.0.2)"},{"location":"install/install-edex-beta-v20/","text":"Install EDEX - BETA Version! \uf0c1 EDEX is the E nvironmental D ata Ex change system that represents the backend server for AWIPS. EDEX is only supported for Linux systems: CentOS and RHEL, and ideally, it should be on its own dedicated machine. It requires administrator priviledges to make root-level changes. EDEX can run on a single machine or be spread across multiple machines. To learn more about that please look at Distributed EDEX, Installing Across Multiple Machines Latest Version \uf0c1 20.3.2-0.4 View release notes Version 20.* of CAVE is not compatible with Version 18.* EDEX and vice versa, Version 18.* of CAVE is not compatible with Version 20.* EDEX. Functionality/Reporting \uf0c1 This is a beta release, so we are aware that not all functionality is working as expected. We ask you to please be aware of this and have similar expectations. One noteworthy deficiency we are aware of is the radar menu has not been updated yet to reflect what is in version 18. If you come across issues/bugs/missing functionality, we also encourage you to report it using this short form . System requirements \uf0c1 64-bit CentOS/RHEL 7 While CentOS8 has reach End of Life as of Dec. 31, 2021, CentOS7 End of Life isn't until June 30, 2024. Bash shell environment 16+ CPU cores (each CPU core can run a decorder in parallel) 24GB RAM 700GB+ Disk Space gcc-c++ package A Solid State Drive (SSD) is recommended A SSD should be mounted either to /awips2 (to contain the entire EDEX system) or to /awips2/edex/data/hdf5 (to contain the large files in the decoded data store). EDEX can scale to any system by adjusting the incoming LDM data feeds or adjusting the resources (CPU threads) allocated to each data type. EDEX is only supported for 64-bit CentOS and RHEL 7 Operating Systems. EDEX is not supported in Debian, Ubuntu, SUSE, Solaris, macOS, or Windows. You may have luck with Fedora Core 12 to 14 and Scientific Linux, but we will not provide support. Download and Installation Instructions \uf0c1 The first 3 steps should all be run as root 1. 
Install EDEX \uf0c1 Download and run the installer: awips_install.sh wget https://downloads.unidata.ucar.edu/awips2/20.3.2/linux/awips_install-v20.sh chmod 755 awips_install-v20.sh sudo ./awips_install-v20.sh --edex awips_install-v20.sh --edex will perform the following steps (it's always a good idea to review downloaded shell scripts): Checks to see if EDEX is currently running, if so stops the processes with the edex stop command If EDEX is installed, asks the user if it can be removed and where to backup the data to and does a yum groupremove awips2-server If the user/group awips:fxalpha does not exist, it gets created Saves the appropriate yum repo file to /etc/yum.repos.d/awips2.repo Increases process and file limits for the the awips account in /etc/security/limits.conf Creates /awips2/data_store if it does not exist already Runs yum groupinstall awips2-server If you receive an error relating to yum, then please run sudo su - -c \"[PATH_TO_INSTALL_FILE]/awips_install-v20.sh --edex\" 2. EDEX Setup \uf0c1 The external and localhost addresses need to be specified in /etc/hosts 127.0.0.1 localhost localhost.localdomain XXX.XXX.XXX.XXX edex-cloud edex-cloud.unidata.ucar.edu 3. Configure iptables \uf0c1 This should be a one time configuration change. Configure iptables to allow TCP connections on ports 9581 and 9582 if you want to serve data publicly to CAVE clients and the Python API. Open Port 9588 \uf0c1 If you are running a Registry (Data Delivery) server, you will also want to open port 9588 . To open ports to all connections \uf0c1 vi /etc/sysconfig/iptables *filter :INPUT ACCEPT [0:0] :FORWARD ACCEPT [0:0] :OUTPUT ACCEPT [0:0] -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT -A INPUT -p icmp -j ACCEPT -A INPUT -i lo -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 22 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9581 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9582 -j ACCEPT #-A INPUT -m state --state NEW -m tcp -p tcp --dport 9588 -j ACCEPT # for registry/dd -A INPUT -j REJECT --reject-with icmp-host-prohibited -A FORWARD -j REJECT --reject-with icmp-host-prohibited COMMIT To open ports to specific IP addresses \uf0c1 In this example, the IP range 128.117.140.0/24 will match all 128.117.140.* addresses, while 128.117.156.0/24 will match 128.117.156.*. vi /etc/sysconfig/iptables *filter :INPUT DROP [0:0] :FORWARD DROP [0:0] :OUTPUT ACCEPT [0:0] :EXTERNAL - [0:0] :EDEX - [0:0] -A INPUT -i lo -j ACCEPT -A INPUT -p icmp --icmp-type any -j ACCEPT -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT -A INPUT -s 128.117.140.0/24 -j EDEX -A INPUT -s 128.117.156.0/24 -j EDEX -A INPUT -j EXTERNAL -A EXTERNAL -j REJECT -A EDEX -m state --state NEW -p tcp --dport 22 -j ACCEPT -A EDEX -m state --state NEW -p tcp --dport 9581 -j ACCEPT -A EDEX -m state --state NEW -p tcp --dport 9582 -j ACCEPT #-A EDEX -m state --state NEW -p tcp --dport 9588 -j ACCEPT # for registry/dd -A EDEX -j REJECT COMMIT Restart iptables \uf0c1 service iptables restart Troubleshooting \uf0c1 For CentOS 7 error: Redirecting to /bin/systemctl restart iptables.service Failed to restart iptables.service: Unit iptables.service failed to load: No such file or directory. The solution is: yum install iptables-services systemctl enable iptables service iptables restart 4. Start EDEX \uf0c1 These steps should be run as user awips with sudo. Switch to the user by running su - awips . 
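# edex start (below) is the single command for bringing up the EDEX services; afterwards,
# edex status lists postgres, pypies, qpid, the EDEX JVMs, and ldmadmin, which makes it an
# easy way to confirm that everything came up (sample status output appears later in this page).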
edex start To manually start, stop, and restart: service edex_postgres start service httpd-pypies start service qpidd start service edex_camel start The fifth service, edex_ldm , does not run at boot to prevent filling up disk space if EDEX is not running. Start ldm manually: service edex_ldm start To restart EDEX edex restart Additional Notes \uf0c1 Ensure SELinux is Disabled \uf0c1 vi /etc/sysconfig/selinux # This file controls the state of SELinux on the system. # SELINUX= can take one of these three values: # enforcing - SELinux security policy is enforced. # permissive - SELinux prints warnings instead of enforcing. # disabled - No SELinux policy is loaded. SELINUX=disabled # SELINUXTYPE= can take one of these two values: # targeted - Targeted processes are protected, # mls - Multi Level Security protection. SELINUXTYPE=targeted Read more about selinux at redhat.com SSD Mount \uf0c1 Though a Solid State Drive is not required, it is strongly encouraged in order to handle the amount of disk IO for real-time IDD feeds. The simplest configuration would be to mount an 500GB+ SSD to /awips2 to contain both the installed software (approx. 20GB) and the real-time data (approx. 150GB per day). The default purge rules are configured such that the processed data in /awips2 does not exceed 450GB. The raw data is located in /awips2/data_store , and is scoured every hour and should not exceed 50GB. If you want to increase EDEX data retention you should mount a large disk to /awips2/edex/data/hdf5 since this will be where the archived processed data exists, and any case studies created. Filesystem Size Used Avail Use% Mounted on /dev/sda1 30G 2.5G 26G 9% / tmpfs 28G 0 28G 0% /dev/shm /dev/sdc1 788G 81G 667G 11% /awips2 /dev/sdb1 788G 41G 708G 10% /awips2/edex/data/hdf5 Configure LDM Feeds \uf0c1 EDEX installs its own version of the LDM to the directory /awips2/ldm . As with a the default LDM configuration, two files are used to control what IDD feeds are ingested: Configuration file: /awips2/ldm/etc/ldmd.conf \uf0c1 This file specifies an upstream LDM server to request data from, and what feeds to request: REQUEST NEXRAD3 \"./p(DHR|DPR|DSP|DTA|DAA|DVL|EET|HHC|N0Q|N0S|N0U|OHA|NVW|NTV|NST).\" idd.unidata.ucar.edu REQUEST FNEXRAD|IDS|DDPLUS|UNIWISC \".*\" idd.unidata.ucar.edu REQUEST NGRID \".*\" idd.unidata.ucar.edu REQUEST NOTHER \"^TIP... KNES.*\" idd.unidata.ucar.edu Read more about ldmd.conf in the LDM User Manual Configuration File: /awips2/ldm/etc/pqact.conf \uf0c1 This file specifies the WMO headers and file pattern actions to request: # Redbook graphics ANY ^([PQ][A-Z0-9]{3,5}) (....) (..)(..)(..) !redbook [^/]*/([^/]*)/([^/]*)/([^/]*)/([0-9]{8}) FILE -overwrite -close -edex /awips2/data_store/redbook/\\8/\\4\\5Z_\\8_\\7_\\6-\\1_\\2_(seq).rb.%Y%m%d%H # NOAAPORT GINI images NIMAGE ^(sat[^/]*)/ch[0-9]/([^/]*)/([^/]*)/([^ ]*) ([^/]*)/([^/]*)/([^/]*)/ (T[^ ]*) ([^ ]*) (..)(..)(..) FILE -overwrite -close -edex /awips2/data_store/sat/\\(11)\\(12)Z_\\3_\\7_\\6-\\8_\\9_(seq).satz.%Y%m%d%H Read more about pqact.conf in the LDM User Manual See available AWIPS LDM feeds Configuration File: /awips2/ldm/etc/registry.xml \uf0c1 This file specifies configuration and runtime parameters. If you are pulling in a lot of data, you may want to consider increasing your LDM queue size: /awips2/ldm/var/queues/ldm.pq 24GB default Read more about registry.xml in the LDM User Manual Directories to Know \uf0c1 /awips2 - Contains all of the installed AWIPS software. /awips2/edex/logs - EDEX logs. 
/awips2/httpd_pypies/var/log/httpd - httpd-pypies logs. /awips2/database/data/pg_log - PostgreSQL logs. /awips2/qpid/log - Qpid logs. /awips2/edex/data/hdf5 - HDF5 data store. /awips2/edex/data/utility - Localization store and configuration files. /awips2/ldm/etc - Location of ldmd.conf and pqact.conf /awips2/ldm/logs - LDM logs. /awips2/data_store - Raw data store. /awips2/data_store/ingest - Manual data ingest endpoint. What Version is my EDEX? \uf0c1 rpm -qa | grep awips2-edex Uninstalling EDEX \uf0c1 These are instructions to manually uninstall EDEX. However, the awips_install.sh script will do all of these steps for you if you are installing a newer version of EDEX. 1. Make sure all EDEX processes are stopped sudo edex stop sudo edex status [edex status] postgres :: not running pypies :: not running qpid :: not running EDEXingest :: not running EDEXgrib :: not running EDEXrequest :: not running ldmadmin :: not running 2. Backup any important configuration files that you may want to reference Here are some possible important directories/files to backup: /awips2/database/data/pg_hba.conf /awips2/edex/data/utility/* /awips2/edex/bin/* /awips2/ldm/* /awips2/dev/* /awips2/edex/conf* /awips2/edex/etc/* /awips2/edex/logs/* /usr/bin/edex/* /etc/init.d/edexServiceList 3. See what AWIPS yum groups are currently installed In this case the AWIPS EDEX Server group is installed sudo yum grouplist Available Environment Groups: Minimal Install Compute Node Infrastructure Server File and Print Server Cinnamon Desktop MATE Desktop Basic Web Server Virtualization Host Server with GUI GNOME Desktop KDE Plasma Workspaces Development and Creative Workstation Installed Groups: AWIPS EDEX Server Development Tools Available Groups: AWIPS ADE SERVER AWIPS CAVE AWIPS Development AWIPS EDEX DAT Server AWIPS EDEX Database/Request Server AWIPS EDEX Decode/Ingest Node (No Database, PyPIES, GFE) Cinnamon Compatibility Libraries Console Internet Tools Educational Software Electronic Lab Fedora Packager General Purpose Desktop Graphical Administration Tools Haskell LXQt Desktop Legacy UNIX Compatibility MATE Milkymist Scientific Support Security Tools Smart Card Support System Administration Tools System Management TurboGears application framework Xfce 4. Remove any currently installed AWIPS yum groups sudo yum clean all sudo yum groupremove \"AWIPS EDEX Server\" If you are having trouble removing a group, see the troubleshooting section. 5. Check to make sure all awips rpms have been removed rpm -qa | grep awips2 If you still have rpms installed, remove them sudo yum remove awips2-* 6. Remove everything in the /awips2 directory rm -rf /awips2/*","title":"Install EDEX - BETA Version! "},{"location":"install/install-edex-beta-v20/#install-edex-beta-version","text":"EDEX is the E nvironmental D ata Ex change system that represents the backend server for AWIPS. EDEX is only supported for Linux systems: CentOS and RHEL, and ideally, it should be on its own dedicated machine. It requires administrator priviledges to make root-level changes. EDEX can run on a single machine or be spread across multiple machines. To learn more about that please look at Distributed EDEX, Installing Across Multiple Machines","title":"Install EDEX - BETA Version! 
"},{"location":"install/install-edex-beta-v20/#latest-version","text":"20.3.2-0.4 View release notes Version 20.* of CAVE is not compatible with Version 18.* EDEX and vice versa, Version 18.* of CAVE is not compatible with Version 20.* EDEX.","title":"Latest Version"},{"location":"install/install-edex-beta-v20/#functionalityreporting","text":"This is a beta release, so we are aware that not all functionality is working as expected. We ask you to please be aware of this and have similar expectations. One noteworthy deficiency we are aware of is the radar menu has not been updated yet to reflect what is in version 18. If you come across issues/bugs/missing functionality, we also encourage you to report it using this short form .","title":"Functionality/Reporting"},{"location":"install/install-edex-beta-v20/#system-requirements","text":"64-bit CentOS/RHEL 7 While CentOS8 has reach End of Life as of Dec. 31, 2021, CentOS7 End of Life isn't until June 30, 2024. Bash shell environment 16+ CPU cores (each CPU core can run a decorder in parallel) 24GB RAM 700GB+ Disk Space gcc-c++ package A Solid State Drive (SSD) is recommended A SSD should be mounted either to /awips2 (to contain the entire EDEX system) or to /awips2/edex/data/hdf5 (to contain the large files in the decoded data store). EDEX can scale to any system by adjusting the incoming LDM data feeds or adjusting the resources (CPU threads) allocated to each data type. EDEX is only supported for 64-bit CentOS and RHEL 7 Operating Systems. EDEX is not supported in Debian, Ubuntu, SUSE, Solaris, macOS, or Windows. You may have luck with Fedora Core 12 to 14 and Scientific Linux, but we will not provide support.","title":"System requirements"},{"location":"install/install-edex-beta-v20/#download-and-installation-instructions","text":"The first 3 steps should all be run as root","title":"Download and Installation Instructions"},{"location":"install/install-edex-beta-v20/#1-install-edex","text":"Download and run the installer: awips_install.sh wget https://downloads.unidata.ucar.edu/awips2/20.3.2/linux/awips_install-v20.sh chmod 755 awips_install-v20.sh sudo ./awips_install-v20.sh --edex awips_install-v20.sh --edex will perform the following steps (it's always a good idea to review downloaded shell scripts): Checks to see if EDEX is currently running, if so stops the processes with the edex stop command If EDEX is installed, asks the user if it can be removed and where to backup the data to and does a yum groupremove awips2-server If the user/group awips:fxalpha does not exist, it gets created Saves the appropriate yum repo file to /etc/yum.repos.d/awips2.repo Increases process and file limits for the the awips account in /etc/security/limits.conf Creates /awips2/data_store if it does not exist already Runs yum groupinstall awips2-server If you receive an error relating to yum, then please run sudo su - -c \"[PATH_TO_INSTALL_FILE]/awips_install-v20.sh --edex\"","title":"1. Install EDEX"},{"location":"install/install-edex-beta-v20/#2-edex-setup","text":"The external and localhost addresses need to be specified in /etc/hosts 127.0.0.1 localhost localhost.localdomain XXX.XXX.XXX.XXX edex-cloud edex-cloud.unidata.ucar.edu","title":"2. EDEX Setup"},{"location":"install/install-edex-beta-v20/#3-configure-iptables","text":"This should be a one time configuration change. Configure iptables to allow TCP connections on ports 9581 and 9582 if you want to serve data publicly to CAVE clients and the Python API.","title":"3. 
Configure iptables"},{"location":"install/install-edex-beta-v20/#open-port-9588","text":"If you are running a Registry (Data Delivery) server, you will also want to open port 9588 .","title":"Open Port 9588"},{"location":"install/install-edex-beta-v20/#to-open-ports-to-all-connections","text":"vi /etc/sysconfig/iptables *filter :INPUT ACCEPT [0:0] :FORWARD ACCEPT [0:0] :OUTPUT ACCEPT [0:0] -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT -A INPUT -p icmp -j ACCEPT -A INPUT -i lo -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 22 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9581 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9582 -j ACCEPT #-A INPUT -m state --state NEW -m tcp -p tcp --dport 9588 -j ACCEPT # for registry/dd -A INPUT -j REJECT --reject-with icmp-host-prohibited -A FORWARD -j REJECT --reject-with icmp-host-prohibited COMMIT","title":"To open ports to all connections"},{"location":"install/install-edex-beta-v20/#to-open-ports-to-specific-ip-addresses","text":"In this example, the IP range 128.117.140.0/24 will match all 128.117.140.* addresses, while 128.117.156.0/24 will match 128.117.156.*. vi /etc/sysconfig/iptables *filter :INPUT DROP [0:0] :FORWARD DROP [0:0] :OUTPUT ACCEPT [0:0] :EXTERNAL - [0:0] :EDEX - [0:0] -A INPUT -i lo -j ACCEPT -A INPUT -p icmp --icmp-type any -j ACCEPT -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT -A INPUT -s 128.117.140.0/24 -j EDEX -A INPUT -s 128.117.156.0/24 -j EDEX -A INPUT -j EXTERNAL -A EXTERNAL -j REJECT -A EDEX -m state --state NEW -p tcp --dport 22 -j ACCEPT -A EDEX -m state --state NEW -p tcp --dport 9581 -j ACCEPT -A EDEX -m state --state NEW -p tcp --dport 9582 -j ACCEPT #-A EDEX -m state --state NEW -p tcp --dport 9588 -j ACCEPT # for registry/dd -A EDEX -j REJECT COMMIT","title":"To open ports to specific IP addresses"},{"location":"install/install-edex-beta-v20/#restart-iptables","text":"service iptables restart","title":"Restart iptables"},{"location":"install/install-edex-beta-v20/#troubleshooting","text":"For CentOS 7 error: Redirecting to /bin/systemctl restart iptables.service Failed to restart iptables.service: Unit iptables.service failed to load: No such file or directory. The solution is: yum install iptables-services systemctl enable iptables service iptables restart","title":"Troubleshooting"},{"location":"install/install-edex-beta-v20/#4-start-edex","text":"These steps should be run as user awips with sudo. Switch to the user by running su - awips . edex start To manually start, stop, and restart: service edex_postgres start service httpd-pypies start service qpidd start service edex_camel start The fifth service, edex_ldm , does not run at boot to prevent filling up disk space if EDEX is not running. Start ldm manually: service edex_ldm start To restart EDEX edex restart","title":"4. Start EDEX"},{"location":"install/install-edex-beta-v20/#additional-notes","text":"","title":"Additional Notes"},{"location":"install/install-edex-beta-v20/#ensure-selinux-is-disabled","text":"vi /etc/sysconfig/selinux # This file controls the state of SELinux on the system. # SELINUX= can take one of these three values: # enforcing - SELinux security policy is enforced. # permissive - SELinux prints warnings instead of enforcing. # disabled - No SELinux policy is loaded. SELINUX=disabled # SELINUXTYPE= can take one of these two values: # targeted - Targeted processes are protected, # mls - Multi Level Security protection. 
SELINUXTYPE=targeted Read more about selinux at redhat.com","title":"Ensure SELinux is Disabled"},{"location":"install/install-edex-beta-v20/#ssd-mount","text":"Though a Solid State Drive is not required, it is strongly encouraged in order to handle the amount of disk IO for real-time IDD feeds. The simplest configuration would be to mount an 500GB+ SSD to /awips2 to contain both the installed software (approx. 20GB) and the real-time data (approx. 150GB per day). The default purge rules are configured such that the processed data in /awips2 does not exceed 450GB. The raw data is located in /awips2/data_store , and is scoured every hour and should not exceed 50GB. If you want to increase EDEX data retention you should mount a large disk to /awips2/edex/data/hdf5 since this will be where the archived processed data exists, and any case studies created. Filesystem Size Used Avail Use% Mounted on /dev/sda1 30G 2.5G 26G 9% / tmpfs 28G 0 28G 0% /dev/shm /dev/sdc1 788G 81G 667G 11% /awips2 /dev/sdb1 788G 41G 708G 10% /awips2/edex/data/hdf5","title":"SSD Mount"},{"location":"install/install-edex-beta-v20/#configure-ldm-feeds","text":"EDEX installs its own version of the LDM to the directory /awips2/ldm . As with a the default LDM configuration, two files are used to control what IDD feeds are ingested:","title":"Configure LDM Feeds"},{"location":"install/install-edex-beta-v20/#configuration-file-awips2ldmetcldmdconf","text":"This file specifies an upstream LDM server to request data from, and what feeds to request: REQUEST NEXRAD3 \"./p(DHR|DPR|DSP|DTA|DAA|DVL|EET|HHC|N0Q|N0S|N0U|OHA|NVW|NTV|NST).\" idd.unidata.ucar.edu REQUEST FNEXRAD|IDS|DDPLUS|UNIWISC \".*\" idd.unidata.ucar.edu REQUEST NGRID \".*\" idd.unidata.ucar.edu REQUEST NOTHER \"^TIP... KNES.*\" idd.unidata.ucar.edu Read more about ldmd.conf in the LDM User Manual","title":"Configuration file: /awips2/ldm/etc/ldmd.conf"},{"location":"install/install-edex-beta-v20/#configuration-file-awips2ldmetcpqactconf","text":"This file specifies the WMO headers and file pattern actions to request: # Redbook graphics ANY ^([PQ][A-Z0-9]{3,5}) (....) (..)(..)(..) !redbook [^/]*/([^/]*)/([^/]*)/([^/]*)/([0-9]{8}) FILE -overwrite -close -edex /awips2/data_store/redbook/\\8/\\4\\5Z_\\8_\\7_\\6-\\1_\\2_(seq).rb.%Y%m%d%H # NOAAPORT GINI images NIMAGE ^(sat[^/]*)/ch[0-9]/([^/]*)/([^/]*)/([^ ]*) ([^/]*)/([^/]*)/([^/]*)/ (T[^ ]*) ([^ ]*) (..)(..)(..) FILE -overwrite -close -edex /awips2/data_store/sat/\\(11)\\(12)Z_\\3_\\7_\\6-\\8_\\9_(seq).satz.%Y%m%d%H Read more about pqact.conf in the LDM User Manual See available AWIPS LDM feeds","title":"Configuration File: /awips2/ldm/etc/pqact.conf"},{"location":"install/install-edex-beta-v20/#configuration-file-awips2ldmetcregistryxml","text":"This file specifies configuration and runtime parameters. If you are pulling in a lot of data, you may want to consider increasing your LDM queue size: /awips2/ldm/var/queues/ldm.pq 24GB default Read more about registry.xml in the LDM User Manual","title":"Configuration File: /awips2/ldm/etc/registry.xml"},{"location":"install/install-edex-beta-v20/#directories-to-know","text":"/awips2 - Contains all of the installed AWIPS software. /awips2/edex/logs - EDEX logs. /awips2/httpd_pypies/var/log/httpd - httpd-pypies logs. /awips2/database/data/pg_log - PostgreSQL logs. /awips2/qpid/log - Qpid logs. /awips2/edex/data/hdf5 - HDF5 data store. /awips2/edex/data/utility - Localization store and configuration files. 
/awips2/ldm/etc - Location of ldmd.conf and pqact.conf /awips2/ldm/logs - LDM logs. /awips2/data_store - Raw data store. /awips2/data_store/ingest - Manual data ingest endpoint.","title":"Directories to Know"},{"location":"install/install-edex-beta-v20/#what-version-is-my-edex","text":"rpm -qa | grep awips2-edex","title":"What Version is my EDEX?"},{"location":"install/install-edex-beta-v20/#uninstalling-edex","text":"These are instructions to manually uninstall EDEX. However, the awips_install.sh script will do all of these steps for you if you are installing a newer version of EDEX. 1. Make sure all EDEX processes are stopped sudo edex stop sudo edex status [edex status] postgres :: not running pypies :: not running qpid :: not running EDEXingest :: not running EDEXgrib :: not running EDEXrequest :: not running ldmadmin :: not running 2. Backup any important configuration files that you may want to reference Here are some possible important directories/files to backup: /awips2/database/data/pg_hba.conf /awips2/edex/data/utility/* /awips2/edex/bin/* /awips2/ldm/* /awips2/dev/* /awips2/edex/conf* /awips2/edex/etc/* /awips2/edex/logs/* /usr/bin/edex/* /etc/init.d/edexServiceList 3. See what AWIPS yum groups are currently installed In this case the AWIPS EDEX Server group is installed sudo yum grouplist Available Environment Groups: Minimal Install Compute Node Infrastructure Server File and Print Server Cinnamon Desktop MATE Desktop Basic Web Server Virtualization Host Server with GUI GNOME Desktop KDE Plasma Workspaces Development and Creative Workstation Installed Groups: AWIPS EDEX Server Development Tools Available Groups: AWIPS ADE SERVER AWIPS CAVE AWIPS Development AWIPS EDEX DAT Server AWIPS EDEX Database/Request Server AWIPS EDEX Decode/Ingest Node (No Database, PyPIES, GFE) Cinnamon Compatibility Libraries Console Internet Tools Educational Software Electronic Lab Fedora Packager General Purpose Desktop Graphical Administration Tools Haskell LXQt Desktop Legacy UNIX Compatibility MATE Milkymist Scientific Support Security Tools Smart Card Support System Administration Tools System Management TurboGears application framework Xfce 4. Remove any currently installed AWIPS yum groups sudo yum clean all sudo yum groupremove \"AWIPS EDEX Server\" If you are having trouble removing a group, see the troubleshooting section. 5. Check to make sure all awips rpms have been removed rpm -qa | grep awips2 If you still have rpms installed, remove them sudo yum remove awips2-* 6. Remove everything in the /awips2 directory rm -rf /awips2/*","title":"Uninstalling EDEX"},{"location":"install/install-edex/","text":"Install EDEX \uf0c1 EDEX is the E nvironmental D ata Ex change system that represents the backend server for AWIPS. EDEX is only supported for Linux systems: CentOS and RHEL, and ideally, it should be on its own dedicated machine. It requires administrator priviledges to make root-level changes. EDEX can run on a single machine or be spread across multiple machines. To learn more about that please look at Distributed EDEX, Installing Across Multiple Machines Latest Version \uf0c1 18.2.1-6 BETA Version: 20.3.2-0.4 View release notes System requirements \uf0c1 64-bit CentOS/RHEL 7 While CentOS8 has reach End of Life as of Dec. 31, 2021, CentOS7 End of Life isn't until June 30, 2024. 
Bash shell environment 16+ CPU cores (each CPU core can run a decoder in parallel) 24GB RAM 700GB+ Disk Space gcc-c++ package Run rpm -qa | grep gcc-c++ to verify if the package is installed If it is not installed, run yum install gcc-c++ to install the package A Solid State Drive (SSD) is recommended An SSD should be mounted either to /awips2 (to contain the entire EDEX system) or to /awips2/edex/data/hdf5 (to contain the large files in the decoded data store). EDEX can scale to any system by adjusting the incoming LDM data feeds or adjusting the resources (CPU threads) allocated to each data type. EDEX is only supported for 64-bit CentOS and RHEL 7 Operating Systems. EDEX is not supported in Debian, Ubuntu, SUSE, Solaris, macOS, or Windows. You may have luck with Fedora Core 12 to 14 and Scientific Linux, but we will not provide support. Download and Installation Instructions \uf0c1 The first 3 steps should all be run as root 1. Install EDEX \uf0c1 Download and run the installer: awips_install.sh wget https://downloads.unidata.ucar.edu/awips2/current/linux/awips_install.sh chmod 755 awips_install.sh sudo ./awips_install.sh --edex awips_install.sh --edex will perform the following steps (it's always a good idea to review downloaded shell scripts): Checks to see if EDEX is currently running, if so stops the processes with the edex stop command If EDEX is installed, asks the user if it can be removed and where to back up the data, then does a yum groupremove awips2-server If the user/group awips:fxalpha does not exist, it gets created Saves the appropriate yum repo file to /etc/yum.repos.d/awips2.repo Increases process and file limits for the awips account in /etc/security/limits.conf Creates /awips2/data_store if it does not exist already Runs yum groupinstall awips2-server If you receive an error relating to yum, then please run sudo su - -c \"[PATH_TO_INSTALL_FILE]/awips_install.sh --edex\" 2. EDEX Setup \uf0c1 The external and localhost addresses need to be specified in /etc/hosts 127.0.0.1 localhost localhost.localdomain XXX.XXX.XXX.XXX edex-cloud edex-cloud.unidata.ucar.edu 3. Configure iptables \uf0c1 This should be a one-time configuration change. Configure iptables to allow TCP connections on ports 9581 and 9582 if you want to serve data publicly to CAVE clients and the Python API. Open Port 9588 \uf0c1 If you are running a Registry (Data Delivery) server, you will also want to open port 9588 . To open ports to all connections \uf0c1 vi /etc/sysconfig/iptables *filter :INPUT ACCEPT [0:0] :FORWARD ACCEPT [0:0] :OUTPUT ACCEPT [0:0] -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT -A INPUT -p icmp -j ACCEPT -A INPUT -i lo -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 22 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9581 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9582 -j ACCEPT #-A INPUT -m state --state NEW -m tcp -p tcp --dport 9588 -j ACCEPT # for registry/dd -A INPUT -j REJECT --reject-with icmp-host-prohibited -A FORWARD -j REJECT --reject-with icmp-host-prohibited COMMIT To open ports to specific IP addresses \uf0c1 In this example, the IP range 128.117.140.0/24 will match all 128.117.140.* addresses, while 128.117.156.0/24 will match 128.117.156.*.
vi /etc/sysconfig/iptables *filter :INPUT DROP [0:0] :FORWARD DROP [0:0] :OUTPUT ACCEPT [0:0] :EXTERNAL - [0:0] :EDEX - [0:0] -A INPUT -i lo -j ACCEPT -A INPUT -p icmp --icmp-type any -j ACCEPT -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT -A INPUT -s 128.117.140.0/24 -j EDEX -A INPUT -s 128.117.156.0/24 -j EDEX -A INPUT -j EXTERNAL -A EXTERNAL -j REJECT -A EDEX -m state --state NEW -p tcp --dport 22 -j ACCEPT -A EDEX -m state --state NEW -p tcp --dport 9581 -j ACCEPT -A EDEX -m state --state NEW -p tcp --dport 9582 -j ACCEPT #-A EDEX -m state --state NEW -p tcp --dport 9588 -j ACCEPT # for registry/dd -A EDEX -j REJECT COMMIT Restart iptables \uf0c1 service iptables restart Troubleshooting \uf0c1 For CentOS 7 error: Redirecting to /bin/systemctl restart iptables.service Failed to restart iptables.service: Unit iptables.service failed to load: No such file or directory. The solution is: yum install iptables-services systemctl enable iptables service iptables restart 4. Start EDEX \uf0c1 These steps should be run as user awips with sudo. Switch to the user by running su - awips . edex start To manually start, stop, and restart: service edex_postgres start service httpd-pypies start service qpidd start service edex_camel start The fifth service, edex_ldm , does not run at boot to prevent filling up disk space if EDEX is not running. Start ldm manually: service edex_ldm start To restart EDEX edex restart Additional Notes \uf0c1 Ensure SELinux is Disabled \uf0c1 vi /etc/sysconfig/selinux # This file controls the state of SELinux on the system. # SELINUX= can take one of these three values: # enforcing - SELinux security policy is enforced. # permissive - SELinux prints warnings instead of enforcing. # disabled - No SELinux policy is loaded. SELINUX=disabled # SELINUXTYPE= can take one of these two values: # targeted - Targeted processes are protected, # mls - Multi Level Security protection. SELINUXTYPE=targeted Read more about selinux at redhat.com SSD Mount \uf0c1 Though a Solid State Drive is not required, it is strongly encouraged in order to handle the amount of disk IO for real-time IDD feeds. The simplest configuration would be to mount an 500GB+ SSD to /awips2 to contain both the installed software (approx. 20GB) and the real-time data (approx. 150GB per day). The default purge rules are configured such that the processed data in /awips2 does not exceed 450GB. The raw data is located in /awips2/data_store , and is scoured every hour and should not exceed 50GB. If you want to increase EDEX data retention you should mount a large disk to /awips2/edex/data/hdf5 since this will be where the archived processed data exists, and any case studies created. Filesystem Size Used Avail Use% Mounted on /dev/sda1 30G 2.5G 26G 9% / tmpfs 28G 0 28G 0% /dev/shm /dev/sdc1 788G 81G 667G 11% /awips2 /dev/sdb1 788G 41G 708G 10% /awips2/edex/data/hdf5 Configure LDM Feeds \uf0c1 EDEX installs its own version of the LDM to the directory /awips2/ldm . As with a the default LDM configuration, two files are used to control what IDD feeds are ingested: Configuration file: /awips2/ldm/etc/ldmd.conf \uf0c1 This file specifies an upstream LDM server to request data from, and what feeds to request: REQUEST NEXRAD3 \"./p(DHR|DPR|DSP|DTA|DAA|DVL|EET|HHC|N0Q|N0S|N0U|OHA|NVW|NTV|NST).\" idd.unidata.ucar.edu REQUEST FNEXRAD|IDS|DDPLUS|UNIWISC \".*\" idd.unidata.ucar.edu REQUEST NGRID \".*\" idd.unidata.ucar.edu REQUEST NOTHER \"^TIP... 
KNES.*\" idd.unidata.ucar.edu Read more about ldmd.conf in the LDM User Manual Configuration File: /awips2/ldm/etc/pqact.conf \uf0c1 This file specifies the WMO headers and file pattern actions to request: # Redbook graphics ANY ^([PQ][A-Z0-9]{3,5}) (....) (..)(..)(..) !redbook [^/]*/([^/]*)/([^/]*)/([^/]*)/([0-9]{8}) FILE -overwrite -close -edex /awips2/data_store/redbook/\\8/\\4\\5Z_\\8_\\7_\\6-\\1_\\2_(seq).rb.%Y%m%d%H # NOAAPORT GINI images NIMAGE ^(sat[^/]*)/ch[0-9]/([^/]*)/([^/]*)/([^ ]*) ([^/]*)/([^/]*)/([^/]*)/ (T[^ ]*) ([^ ]*) (..)(..)(..) FILE -overwrite -close -edex /awips2/data_store/sat/\\(11)\\(12)Z_\\3_\\7_\\6-\\8_\\9_(seq).satz.%Y%m%d%H Read more about pqact.conf in the LDM User Manual See available AWIPS LDM feeds Configuration File: /awips2/ldm/etc/registry.xml \uf0c1 This file specifies configuration and runtime parameters. If you are pulling in a lot of data, you may want to consider increasing your LDM queue size: /awips2/ldm/var/queues/ldm.pq 24GB default Read more about registry.xml in the LDM User Manual Directories to Know \uf0c1 /awips2 - Contains all of the installed AWIPS software. /awips2/edex/logs - EDEX logs. /awips2/httpd_pypies/var/log/httpd - httpd-pypies logs. /awips2/database/data/pg_log - PostgreSQL logs. /awips2/qpid/log - Qpid logs. /awips2/edex/data/hdf5 - HDF5 data store. /awips2/edex/data/utility - Localization store and configuration files. /awips2/ldm/etc - Location of ldmd.conf and pqact.conf /awips2/ldm/logs - LDM logs. /awips2/data_store - Raw data store. /awips2/data_store/ingest - Manual data ingest endpoint. What Version is my EDEX? \uf0c1 rpm -qa | grep awips2-edex Uninstalling EDEX \uf0c1 These are instructions to manually uninstall EDEX. However, the awips_install.sh script will do all of these steps for you if you are installing a newer version of EDEX. 1. Make sure all EDEX processes are stopped sudo edex stop sudo edex status [edex status] postgres :: not running pypies :: not running qpid :: not running EDEXingest :: not running EDEXgrib :: not running EDEXrequest :: not running ldmadmin :: not running 2. Backup any important configuration files that you may want to reference Here are some possible important directories/files to backup: /awips2/database/data/pg_hba.conf /awips2/edex/data/utility/* /awips2/edex/bin/* /awips2/ldm/* /awips2/dev/* /awips2/edex/conf* /awips2/edex/etc/* /awips2/edex/logs/* /usr/bin/edex/* /etc/init.d/edexServiceList 3. See what AWIPS yum groups are currently installed In this case the AWIPS EDEX Server group is installed sudo yum grouplist Available Environment Groups: Minimal Install Compute Node Infrastructure Server File and Print Server Cinnamon Desktop MATE Desktop Basic Web Server Virtualization Host Server with GUI GNOME Desktop KDE Plasma Workspaces Development and Creative Workstation Installed Groups: AWIPS EDEX Server Development Tools Available Groups: AWIPS ADE SERVER AWIPS CAVE AWIPS Development AWIPS EDEX DAT Server AWIPS EDEX Database/Request Server AWIPS EDEX Decode/Ingest Node (No Database, PyPIES, GFE) Cinnamon Compatibility Libraries Console Internet Tools Educational Software Electronic Lab Fedora Packager General Purpose Desktop Graphical Administration Tools Haskell LXQt Desktop Legacy UNIX Compatibility MATE Milkymist Scientific Support Security Tools Smart Card Support System Administration Tools System Management TurboGears application framework Xfce 4. 
Remove any currently installed AWIPS yum groups sudo yum clean all sudo yum groupremove \"AWIPS EDEX Server\" If you are having trouble removing a group, see the troubleshooting section. 5. Check to make sure all awips rpms have been removed rpm -qa | grep awips2 If you still have rpms installed, remove them sudo yum remove awips2-* 6. Remove everything in the /awips2 directory rm -rf /awips2/*","title":"Install EDEX"},{"location":"install/install-edex/#install-edex","text":"EDEX is the E nvironmental D ata Ex change system that represents the backend server for AWIPS. EDEX is only supported for Linux systems: CentOS and RHEL, and ideally, it should be on its own dedicated machine. It requires administrator priviledges to make root-level changes. EDEX can run on a single machine or be spread across multiple machines. To learn more about that please look at Distributed EDEX, Installing Across Multiple Machines","title":"Install EDEX "},{"location":"install/install-edex/#latest-version","text":"18.2.1-6 BETA Version: 20.3.2-0.4 View release notes","title":"Latest Version"},{"location":"install/install-edex/#system-requirements","text":"64-bit CentOS/RHEL 7 While CentOS8 has reach End of Life as of Dec. 31, 2021, CentOS7 End of Life isn't until June 30, 2024. Bash shell environment 16+ CPU cores (each CPU core can run a decorder in parallel) 24GB RAM 700GB+ Disk Space gcc-c++ package Run rpm -qa | grep gcc-c++ to verify if the package is installed If it is not installed, run yum install gcc-c++ to install the package A Solid State Drive (SSD) is recommended A SSD should be mounted either to /awips2 (to contain the entire EDEX system) or to /awips2/edex/data/hdf5 (to contain the large files in the decoded data store). EDEX can scale to any system by adjusting the incoming LDM data feeds or adjusting the resources (CPU threads) allocated to each data type. EDEX is only supported for 64-bit CentOS and RHEL 7 Operating Systems. EDEX is not supported in Debian, Ubuntu, SUSE, Solaris, macOS, or Windows. You may have luck with Fedora Core 12 to 14 and Scientific Linux, but we will not provide support.","title":"System requirements"},{"location":"install/install-edex/#download-and-installation-instructions","text":"The first 3 steps should all be run as root","title":"Download and Installation Instructions"},{"location":"install/install-edex/#1-install-edex","text":"Download and run the installer: awips_install.sh wget https://downloads.unidata.ucar.edu/awips2/current/linux/awips_install.sh chmod 755 awips_install.sh sudo ./awips_install.sh --edex awips_install.sh --edex will perform the following steps (it's always a good idea to review downloaded shell scripts): Checks to see if EDEX is currently running, if so stops the processes with the edex stop command If EDEX is installed, asks the user if it can be removed and where to backup the data to and does a yum groupremove awips2-server If the user/group awips:fxalpha does not exist, it gets created Saves the appropriate yum repo file to /etc/yum.repos.d/awips2.repo Increases process and file limits for the the awips account in /etc/security/limits.conf Creates /awips2/data_store if it does not exist already Runs yum groupinstall awips2-server If you receive an error relating to yum, then please run sudo su - -c \"[PATH_TO_INSTALL_FILE]/awips_install.sh --edex\"","title":"1. 
Install EDEX"},{"location":"install/install-edex/#2-edex-setup","text":"The external and localhost addresses need to be specified in /etc/hosts 127.0.0.1 localhost localhost.localdomain XXX.XXX.XXX.XXX edex-cloud edex-cloud.unidata.ucar.edu","title":"2. EDEX Setup"},{"location":"install/install-edex/#3-configure-iptables","text":"This should be a one time configuration change. Configure iptables to allow TCP connections on ports 9581 and 9582 if you want to serve data publicly to CAVE clients and the Python API.","title":"3. Configure iptables"},{"location":"install/install-edex/#open-port-9588","text":"If you are running a Registry (Data Delivery) server, you will also want to open port 9588 .","title":"Open Port 9588"},{"location":"install/install-edex/#to-open-ports-to-all-connections","text":"vi /etc/sysconfig/iptables *filter :INPUT ACCEPT [0:0] :FORWARD ACCEPT [0:0] :OUTPUT ACCEPT [0:0] -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT -A INPUT -p icmp -j ACCEPT -A INPUT -i lo -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 22 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9581 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9582 -j ACCEPT #-A INPUT -m state --state NEW -m tcp -p tcp --dport 9588 -j ACCEPT # for registry/dd -A INPUT -j REJECT --reject-with icmp-host-prohibited -A FORWARD -j REJECT --reject-with icmp-host-prohibited COMMIT","title":"To open ports to all connections"},{"location":"install/install-edex/#to-open-ports-to-specific-ip-addresses","text":"In this example, the IP range 128.117.140.0/24 will match all 128.117.140.* addresses, while 128.117.156.0/24 will match 128.117.156.*. vi /etc/sysconfig/iptables *filter :INPUT DROP [0:0] :FORWARD DROP [0:0] :OUTPUT ACCEPT [0:0] :EXTERNAL - [0:0] :EDEX - [0:0] -A INPUT -i lo -j ACCEPT -A INPUT -p icmp --icmp-type any -j ACCEPT -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT -A INPUT -s 128.117.140.0/24 -j EDEX -A INPUT -s 128.117.156.0/24 -j EDEX -A INPUT -j EXTERNAL -A EXTERNAL -j REJECT -A EDEX -m state --state NEW -p tcp --dport 22 -j ACCEPT -A EDEX -m state --state NEW -p tcp --dport 9581 -j ACCEPT -A EDEX -m state --state NEW -p tcp --dport 9582 -j ACCEPT #-A EDEX -m state --state NEW -p tcp --dport 9588 -j ACCEPT # for registry/dd -A EDEX -j REJECT COMMIT","title":"To open ports to specific IP addresses"},{"location":"install/install-edex/#restart-iptables","text":"service iptables restart","title":"Restart iptables"},{"location":"install/install-edex/#troubleshooting","text":"For CentOS 7 error: Redirecting to /bin/systemctl restart iptables.service Failed to restart iptables.service: Unit iptables.service failed to load: No such file or directory. The solution is: yum install iptables-services systemctl enable iptables service iptables restart","title":"Troubleshooting"},{"location":"install/install-edex/#4-start-edex","text":"These steps should be run as user awips with sudo. Switch to the user by running su - awips . edex start To manually start, stop, and restart: service edex_postgres start service httpd-pypies start service qpidd start service edex_camel start The fifth service, edex_ldm , does not run at boot to prevent filling up disk space if EDEX is not running. Start ldm manually: service edex_ldm start To restart EDEX edex restart","title":"4. 
Start EDEX"},{"location":"install/install-edex/#additional-notes","text":"","title":"Additional Notes"},{"location":"install/install-edex/#ensure-selinux-is-disabled","text":"vi /etc/sysconfig/selinux # This file controls the state of SELinux on the system. # SELINUX= can take one of these three values: # enforcing - SELinux security policy is enforced. # permissive - SELinux prints warnings instead of enforcing. # disabled - No SELinux policy is loaded. SELINUX=disabled # SELINUXTYPE= can take one of these two values: # targeted - Targeted processes are protected, # mls - Multi Level Security protection. SELINUXTYPE=targeted Read more about selinux at redhat.com","title":"Ensure SELinux is Disabled"},{"location":"install/install-edex/#ssd-mount","text":"Though a Solid State Drive is not required, it is strongly encouraged in order to handle the amount of disk IO for real-time IDD feeds. The simplest configuration would be to mount an 500GB+ SSD to /awips2 to contain both the installed software (approx. 20GB) and the real-time data (approx. 150GB per day). The default purge rules are configured such that the processed data in /awips2 does not exceed 450GB. The raw data is located in /awips2/data_store , and is scoured every hour and should not exceed 50GB. If you want to increase EDEX data retention you should mount a large disk to /awips2/edex/data/hdf5 since this will be where the archived processed data exists, and any case studies created. Filesystem Size Used Avail Use% Mounted on /dev/sda1 30G 2.5G 26G 9% / tmpfs 28G 0 28G 0% /dev/shm /dev/sdc1 788G 81G 667G 11% /awips2 /dev/sdb1 788G 41G 708G 10% /awips2/edex/data/hdf5","title":"SSD Mount"},{"location":"install/install-edex/#configure-ldm-feeds","text":"EDEX installs its own version of the LDM to the directory /awips2/ldm . As with a the default LDM configuration, two files are used to control what IDD feeds are ingested:","title":"Configure LDM Feeds"},{"location":"install/install-edex/#configuration-file-awips2ldmetcldmdconf","text":"This file specifies an upstream LDM server to request data from, and what feeds to request: REQUEST NEXRAD3 \"./p(DHR|DPR|DSP|DTA|DAA|DVL|EET|HHC|N0Q|N0S|N0U|OHA|NVW|NTV|NST).\" idd.unidata.ucar.edu REQUEST FNEXRAD|IDS|DDPLUS|UNIWISC \".*\" idd.unidata.ucar.edu REQUEST NGRID \".*\" idd.unidata.ucar.edu REQUEST NOTHER \"^TIP... KNES.*\" idd.unidata.ucar.edu Read more about ldmd.conf in the LDM User Manual","title":"Configuration file: /awips2/ldm/etc/ldmd.conf"},{"location":"install/install-edex/#configuration-file-awips2ldmetcpqactconf","text":"This file specifies the WMO headers and file pattern actions to request: # Redbook graphics ANY ^([PQ][A-Z0-9]{3,5}) (....) (..)(..)(..) !redbook [^/]*/([^/]*)/([^/]*)/([^/]*)/([0-9]{8}) FILE -overwrite -close -edex /awips2/data_store/redbook/\\8/\\4\\5Z_\\8_\\7_\\6-\\1_\\2_(seq).rb.%Y%m%d%H # NOAAPORT GINI images NIMAGE ^(sat[^/]*)/ch[0-9]/([^/]*)/([^/]*)/([^ ]*) ([^/]*)/([^/]*)/([^/]*)/ (T[^ ]*) ([^ ]*) (..)(..)(..) FILE -overwrite -close -edex /awips2/data_store/sat/\\(11)\\(12)Z_\\3_\\7_\\6-\\8_\\9_(seq).satz.%Y%m%d%H Read more about pqact.conf in the LDM User Manual See available AWIPS LDM feeds","title":"Configuration File: /awips2/ldm/etc/pqact.conf"},{"location":"install/install-edex/#configuration-file-awips2ldmetcregistryxml","text":"This file specifies configuration and runtime parameters. 
If you are pulling in a lot of data, you may want to consider increasing your LDM queue size: /awips2/ldm/var/queues/ldm.pq 24GB default Read more about registry.xml in the LDM User Manual","title":"Configuration File: /awips2/ldm/etc/registry.xml"},{"location":"install/install-edex/#directories-to-know","text":"/awips2 - Contains all of the installed AWIPS software. /awips2/edex/logs - EDEX logs. /awips2/httpd_pypies/var/log/httpd - httpd-pypies logs. /awips2/database/data/pg_log - PostgreSQL logs. /awips2/qpid/log - Qpid logs. /awips2/edex/data/hdf5 - HDF5 data store. /awips2/edex/data/utility - Localization store and configuration files. /awips2/ldm/etc - Location of ldmd.conf and pqact.conf /awips2/ldm/logs - LDM logs. /awips2/data_store - Raw data store. /awips2/data_store/ingest - Manual data ingest endpoint.","title":"Directories to Know"},{"location":"install/install-edex/#what-version-is-my-edex","text":"rpm -qa | grep awips2-edex","title":"What Version is my EDEX?"},{"location":"install/install-edex/#uninstalling-edex","text":"These are instructions to manually uninstall EDEX. However, the awips_install.sh script will do all of these steps for you if you are installing a newer version of EDEX. 1. Make sure all EDEX processes are stopped sudo edex stop sudo edex status [edex status] postgres :: not running pypies :: not running qpid :: not running EDEXingest :: not running EDEXgrib :: not running EDEXrequest :: not running ldmadmin :: not running 2. Backup any important configuration files that you may want to reference Here are some possible important directories/files to backup: /awips2/database/data/pg_hba.conf /awips2/edex/data/utility/* /awips2/edex/bin/* /awips2/ldm/* /awips2/dev/* /awips2/edex/conf* /awips2/edex/etc/* /awips2/edex/logs/* /usr/bin/edex/* /etc/init.d/edexServiceList 3. See what AWIPS yum groups are currently installed In this case the AWIPS EDEX Server group is installed sudo yum grouplist Available Environment Groups: Minimal Install Compute Node Infrastructure Server File and Print Server Cinnamon Desktop MATE Desktop Basic Web Server Virtualization Host Server with GUI GNOME Desktop KDE Plasma Workspaces Development and Creative Workstation Installed Groups: AWIPS EDEX Server Development Tools Available Groups: AWIPS ADE SERVER AWIPS CAVE AWIPS Development AWIPS EDEX DAT Server AWIPS EDEX Database/Request Server AWIPS EDEX Decode/Ingest Node (No Database, PyPIES, GFE) Cinnamon Compatibility Libraries Console Internet Tools Educational Software Electronic Lab Fedora Packager General Purpose Desktop Graphical Administration Tools Haskell LXQt Desktop Legacy UNIX Compatibility MATE Milkymist Scientific Support Security Tools Smart Card Support System Administration Tools System Management TurboGears application framework Xfce 4. Remove any currently installed AWIPS yum groups sudo yum clean all sudo yum groupremove \"AWIPS EDEX Server\" If you are having trouble removing a group, see the troubleshooting section. 5. Check to make sure all awips rpms have been removed rpm -qa | grep awips2 If you still have rpms installed, remove them sudo yum remove awips2-* 6. Remove everything in the /awips2 directory rm -rf /awips2/*","title":"Uninstalling EDEX"},{"location":"install/start-edex/","text":"EDEX Basic Commands \uf0c1 These steps should be run as user awips with sudo. Switch to the user by running su - awips . Unidata's EDEX install also comes with a simple edex program that can help execute basic EDEX utilities. 
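As a quick illustration of how these utilities fit together (each subcommand is documented in the sections that follow), a typical check-and-start session might look like the sketch below; the exact output will vary from system to system.

```bash
# See which EDEX components are currently running
sudo edex status

# Start everything: PostgreSQL, httpd-pypies, Qpid, EDEX Camel, and the LDM
sudo edex start

# Watch the ingest log to confirm data is being decoded (CTRL+C to exit)
edex log
```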
The most basic of the commands are the following: To start all EDEX services: edex start To stop all EDEX services: edex stop Service and Boot Settings \uf0c1 These commands will start and stop five EDEX service files installed into /etc/init.d/ , four of which are run on boot: service postgres start service httpd-pypies start service qpidd start service edex_camel start The fifth, edex_ldm , does not run at boot to prevent filling up disk space if EDEX is not running: service edex_ldm start All of these services are started and stopped by the single program: edex as mentioned above. LDM Troubleshooting \uf0c1 If the EDEX machine is shut down abruptly, when restarted, it should start up the processes mentioned above . If sudo service edex_ldm start does not start up LDM smoothly, please try these steps: All of the following commands should be run as user awips and the service commands may need to be run with sudo . Run sudo service edex_ldm start or ldmadmin start and recieve this message: ldmadmin start start_ldm(): PID-file \"/awips2/ldm/ldmd.pid\" exists. Verify that all is well and then execute \"ldmadmin clean\" to remove the PID-file. Run ldmadmin clean and sudo service edex_ldm start and receive this error: ldmadmin clean sudo service edex_ldm start Checking the product-queue... The writer-counter of the product-queue isn't zero. Either a process has the product-queue open for writing or the queue might be corrupt. Terminate the process and recheck or use pqcat -l- -s -q /awips2/ldm/var/queues/ldm.pq && pqcheck -F -q /awips2/ldm/var/queues/ldm.pq to validate the queue and set the writer-counter to zero. LDM not started To resolve the above, run: pqcat -l- -s -q /awips2/ldm/var/queues/ldm.pq && pqcheck -F -q /awips2/ldm/var/queues/ldm.pq ldmadmin delqueue ldmadmin mkqueue sudo service edex_ldm start EDEX Commands \uf0c1 Unidata's version of EDEX installs with a helpful edex script that can be used for basic EDEX tasks. edex start \uf0c1 edex start Starting EDEX PostgreSQL: [ OK ] Starting httpd: [ OK ] Starting QPID [ OK ] Starting EDEX Camel (request): Starting EDEX Camel (ingest): Starting EDEX Camel (ingestGrib): Starting AWIPS LDM:The product-queue is OK. ... edex start base \uf0c1 To start all EDEX services except the LDM: edex start base edex stop \uf0c1 edex stop Stopping EDEX Camel (request): Stopping EDEX Camel (ingest): Stopping EDEX Camel (ingestGrib): Stopping QPID [ OK ] Stopping httpd: [ OK ] Stopping EDEX PostgreSQL: [ OK ] Stopping AWIPS LDM:Stopping the LDM server... ... edex setup \uf0c1 edex setup [edex] EDEX IP and Hostname Setup Checking /awips2/database/data/pg_hba.conf [OK] Checking /awips2/edex/bin/setup.env [OK] [edit] Hostname edex.unidata.ucar.edu added to /awips2/ldm/etc/ldmd.conf [done] This command configures and/or confirms that the EDEX hostname and IP address definitions exist ( edex setup is run by edex start ). Note : If your EDEX server is running but you see the message \"Connectivity Error: Unable to validate localization preferences\" in CAVE, it may mean that the domain name defined in /awips2/edex/bin/setup.env can not be resolved from outside the server. Some machines have different internally-resolved and externally-resolved domain names (cloud-based especially). The name defined in setup.env must be externally-resolvable . edex log \uf0c1 edex log [edex] EDEX Log Viewer :: No log specified - Defaulting to ingest log :: Viewing /awips2/edex/logs/edex-ingest-20151209.log. 
Press CTRL+C to exit INFO [Ingest.binlightning-1] /awips2/data_store/SFPA42_KWBC_091833_38031177.2015120918 processed in: 0.0050 (sec) Latency: 0.0550 (sec) INFO [Ingest.obs-1] /awips2/data_store/metar/SAIN31_VABB_091830_131392869.2015120918 processed in: 0.0810 (sec) Latency: 0.1800 (sec) More edex logs... edex log grib edex log request edex log ldm edex log radar edex log satellite edex log text edex qpid \uf0c1 Shows a list of the the Qpid message queue to monitor data ingest (messages in vs messages out, i.e. decoded): [centos@js-156-89 ~]$ edex qpid Queues queue dur excl msg msgIn msgOut bytes bytesIn bytesOut cons bind ================================================================================================ external.dropbox Y Y 11 1.26m 1.26m 621 79.6m 79.6m 5 1 Ingest.Radar Y Y 4 589k 589k 184 27.1m 27.1m 5 1 Ingest.GribDecode Y Y 0 370k 370k 0 103m 103m 11 1 Ingest.GribSplit Y Y 2 361k 361k 201 31.9m 31.9m 5 1 Ingest.modelsounding Y Y 0 100k 100k 0 6.54m 6.54m 1 1 Ingest.Text Y Y 0 97.8k 97.8k 0 5.25m 5.25m 2 1 Ingest.GOESR Y Y 0 83.4k 83.4k 0 6.92m 6.92m 2 1 Ingest.obs Y Y 0 46.2k 46.2k 0 2.40m 2.40m 1 1 Grid.PostProcess Y Y 0 20.2k 20.2k 0 6.68m 6.68m 1 1 Ingest.sfcobs Y Y 0 10.5k 10.5k 0 577k 577k 1 1 Ingest.goessounding Y Y 0 6.68k 6.68k 0 427k 427k 1 1 Ingest.Glm Y Y 0 5.61k 5.61k 0 581k 581k 1 1 Ingest.aww Y Y 0 3.32k 3.32k 0 182k 182k 1 1 edex users \uf0c1 To see a list of clients connecting to your EDEX server, use the edex users [YYYYMMDD] command, where [YYYYMMDD] is the optional date string. edex users -- EDEX Users 20160826 -- user@101.253.20.225 user@192.168.1.67 awips@0.0.0.0 awips@sdsmt.edu edex purge \uf0c1 To view any stuck purge jobs in PortgreSQL (a rare but serious problem if your disk fills up). The solution to this is to run edex purge reset .","title":"EDEX Basic Commands"},{"location":"install/start-edex/#edex-basic-commands","text":"These steps should be run as user awips with sudo. Switch to the user by running su - awips . Unidata's EDEX install also comes with a simple edex program that can help execute basic EDEX utilities. The most basic of the commands are the following: To start all EDEX services: edex start To stop all EDEX services: edex stop","title":"EDEX Basic Commands"},{"location":"install/start-edex/#service-and-boot-settings","text":"These commands will start and stop five EDEX service files installed into /etc/init.d/ , four of which are run on boot: service postgres start service httpd-pypies start service qpidd start service edex_camel start The fifth, edex_ldm , does not run at boot to prevent filling up disk space if EDEX is not running: service edex_ldm start All of these services are started and stopped by the single program: edex as mentioned above.","title":"Service and Boot Settings"},{"location":"install/start-edex/#ldm-troubleshooting","text":"If the EDEX machine is shut down abruptly, when restarted, it should start up the processes mentioned above . If sudo service edex_ldm start does not start up LDM smoothly, please try these steps: All of the following commands should be run as user awips and the service commands may need to be run with sudo . Run sudo service edex_ldm start or ldmadmin start and recieve this message: ldmadmin start start_ldm(): PID-file \"/awips2/ldm/ldmd.pid\" exists. Verify that all is well and then execute \"ldmadmin clean\" to remove the PID-file. Run ldmadmin clean and sudo service edex_ldm start and receive this error: ldmadmin clean sudo service edex_ldm start Checking the product-queue... 
The writer-counter of the product-queue isn't zero. Either a process has the product-queue open for writing or the queue might be corrupt. Terminate the process and recheck or use pqcat -l- -s -q /awips2/ldm/var/queues/ldm.pq && pqcheck -F -q /awips2/ldm/var/queues/ldm.pq to validate the queue and set the writer-counter to zero. LDM not started To resolve the above, run: pqcat -l- -s -q /awips2/ldm/var/queues/ldm.pq && pqcheck -F -q /awips2/ldm/var/queues/ldm.pq ldmadmin delqueue ldmadmin mkqueue sudo service edex_ldm start","title":"LDM Troubleshooting"},{"location":"install/start-edex/#edex-commands","text":"Unidata's version of EDEX installs with a helpful edex script that can be used for basic EDEX tasks.","title":"EDEX Commands"},{"location":"install/start-edex/#edex-start","text":"edex start Starting EDEX PostgreSQL: [ OK ] Starting httpd: [ OK ] Starting QPID [ OK ] Starting EDEX Camel (request): Starting EDEX Camel (ingest): Starting EDEX Camel (ingestGrib): Starting AWIPS LDM:The product-queue is OK. ...","title":"edex start"},{"location":"install/start-edex/#edex-start-base","text":"To start all EDEX services except the LDM: edex start base","title":"edex start base"},{"location":"install/start-edex/#edex-stop","text":"edex stop Stopping EDEX Camel (request): Stopping EDEX Camel (ingest): Stopping EDEX Camel (ingestGrib): Stopping QPID [ OK ] Stopping httpd: [ OK ] Stopping EDEX PostgreSQL: [ OK ] Stopping AWIPS LDM:Stopping the LDM server... ...","title":"edex stop"},{"location":"install/start-edex/#edex-setup","text":"edex setup [edex] EDEX IP and Hostname Setup Checking /awips2/database/data/pg_hba.conf [OK] Checking /awips2/edex/bin/setup.env [OK] [edit] Hostname edex.unidata.ucar.edu added to /awips2/ldm/etc/ldmd.conf [done] This command configures and/or confirms that the EDEX hostname and IP address definitions exist ( edex setup is run by edex start ). Note : If your EDEX server is running but you see the message \"Connectivity Error: Unable to validate localization preferences\" in CAVE, it may mean that the domain name defined in /awips2/edex/bin/setup.env can not be resolved from outside the server. Some machines have different internally-resolved and externally-resolved domain names (cloud-based especially). The name defined in setup.env must be externally-resolvable .","title":"edex setup"},{"location":"install/start-edex/#edex-log","text":"edex log [edex] EDEX Log Viewer :: No log specified - Defaulting to ingest log :: Viewing /awips2/edex/logs/edex-ingest-20151209.log. Press CTRL+C to exit INFO [Ingest.binlightning-1] /awips2/data_store/SFPA42_KWBC_091833_38031177.2015120918 processed in: 0.0050 (sec) Latency: 0.0550 (sec) INFO [Ingest.obs-1] /awips2/data_store/metar/SAIN31_VABB_091830_131392869.2015120918 processed in: 0.0810 (sec) Latency: 0.1800 (sec) More edex logs... edex log grib edex log request edex log ldm edex log radar edex log satellite edex log text","title":"edex log"},{"location":"install/start-edex/#edex-qpid","text":"Shows a list of the the Qpid message queue to monitor data ingest (messages in vs messages out, i.e. 
decoded): [centos@js-156-89 ~]$ edex qpid Queues queue dur excl msg msgIn msgOut bytes bytesIn bytesOut cons bind ================================================================================================ external.dropbox Y Y 11 1.26m 1.26m 621 79.6m 79.6m 5 1 Ingest.Radar Y Y 4 589k 589k 184 27.1m 27.1m 5 1 Ingest.GribDecode Y Y 0 370k 370k 0 103m 103m 11 1 Ingest.GribSplit Y Y 2 361k 361k 201 31.9m 31.9m 5 1 Ingest.modelsounding Y Y 0 100k 100k 0 6.54m 6.54m 1 1 Ingest.Text Y Y 0 97.8k 97.8k 0 5.25m 5.25m 2 1 Ingest.GOESR Y Y 0 83.4k 83.4k 0 6.92m 6.92m 2 1 Ingest.obs Y Y 0 46.2k 46.2k 0 2.40m 2.40m 1 1 Grid.PostProcess Y Y 0 20.2k 20.2k 0 6.68m 6.68m 1 1 Ingest.sfcobs Y Y 0 10.5k 10.5k 0 577k 577k 1 1 Ingest.goessounding Y Y 0 6.68k 6.68k 0 427k 427k 1 1 Ingest.Glm Y Y 0 5.61k 5.61k 0 581k 581k 1 1 Ingest.aww Y Y 0 3.32k 3.32k 0 182k 182k 1 1","title":"edex qpid"},{"location":"install/start-edex/#edex-users","text":"To see a list of clients connecting to your EDEX server, use the edex users [YYYYMMDD] command, where [YYYYMMDD] is the optional date string. edex users -- EDEX Users 20160826 -- user@101.253.20.225 user@192.168.1.67 awips@0.0.0.0 awips@sdsmt.edu","title":"edex users"},{"location":"install/start-edex/#edex-purge","text":"To view any stuck purge jobs in PortgreSQL (a rare but serious problem if your disk fills up). The solution to this is to run edex purge reset .","title":"edex purge"},{"location":"install/starting-services/","text":"All EDEX services are started and stopped with the commands edex start and edex stop , and individual services can be started in the following order service edex_postgres start service httpd-pypies start service qpidd start service edex_camel start service edex_ldm start Services can be stopped in reverse order service edex_ldm stop service edex_camel stop service qpidd stop service httpd-pypies stop service edex_postgres stop The service config files are located in /etc/init.d/ : ls -la /etc/init.d/ |grep -e edex -e pypies -e qpid -rwxr--r-- 1 root root 6693 Nov 7 17:53 edex_camel -rwxr-xr-x 1 root root 1422 Oct 29 15:28 edex_ldm -rwxr--r-- 1 root root 2416 Sep 7 15:48 edex_postgres -rwxr-xr-x 1 root root 5510 Aug 26 13:05 httpd-pypies -rwxr-xr-x 1 root root 3450 Aug 26 13:04 qpidd","title":"Starting services"},{"location":"pdf/","text":"AWIPS System Manager\u2019s Manual: Operational Build 16.2.2 AWIPS CAVE D2D User Manual AWIPS Site Migration Guide AWIPS Flow Tag Instructions: ADE Setup AWIPS Operational Build 16.2.2 Final Release Notes From http://collaborate2.nws.noaa.gov/partners/","title":"NWS Release Documentation"},{"location":"python/awips-grids-and-cartopy/","text":"The simplest example of requesting and plotting AWIPS gridded data with Matplotlib and Cartopy. 
from awips.dataaccess import DataAccessLayer import cartopy.crs as ccrs import matplotlib.pyplot as plt from cartopy.mpl.gridliner import LONGITUDE_FORMATTER, LATITUDE_FORMATTER %matplotlib inline DataAccessLayer.changeEDEXHost(\"edex-cloud.unidata.ucar.edu\") request = DataAccessLayer.newDataRequest() request.setDatatype(\"grid\") request.setLocationNames(\"RAP13\") request.setParameters(\"T\") request.setLevels(\"2.0FHAG\") cycles = DataAccessLayer.getAvailableTimes(request, True) times = DataAccessLayer.getAvailableTimes(request) fcstRun = DataAccessLayer.getForecastRun(cycles[-1], times) response = DataAccessLayer.getGridData(request, [fcstRun[0]]) grid = response[0] data = grid.getRawData() lons, lats = grid.getLatLonCoords() bbox = [lons.min(), lons.max(), lats.min(), lats.max()] def make_map(bbox, projection=ccrs.PlateCarree()): fig, ax = plt.subplots(figsize=(16, 9), subplot_kw=dict(projection=projection)) ax.set_extent(bbox) ax.coastlines(resolution='50m') gl = ax.gridlines(draw_labels=True) gl.xlabels_top = gl.ylabels_right = False gl.xformatter = LONGITUDE_FORMATTER gl.yformatter = LATITUDE_FORMATTER return fig, ax with pcolormesh \uf0c1 cmap = plt.get_cmap('rainbow') fig, ax = make_map(bbox=bbox) cs = ax.pcolormesh(lons, lats, data, cmap=cmap) cbar = fig.colorbar(cs, shrink=0.7, orientation='horizontal') cbar.set_label(str(grid.getLocationName()) +\" \" \\ + str(grid.getLevel()) + \" \" \\ + str(grid.getParameter()) \\ + \" (\" + str(grid.getUnit()) + \") \" \\ + \"valid \" + str(grid.getDataTime().getRefTime())) with contourf \uf0c1 fig2, ax2 = make_map(bbox=bbox) cs2 = ax2.contourf(lons, lats, data, 80, cmap=cmap, vmin=data.min(), vmax=data.max()) cbar2 = fig2.colorbar(cs2, shrink=0.7, orientation='horizontal') cbar2.set_label(str(grid.getLocationName()) +\" \" \\ + str(grid.getLevel()) + \" \" \\ + str(grid.getParameter()) \\ + \" (\" + str(grid.getUnit()) + \") \" \\ + \"valid \" + str(grid.getDataTime().getRefTime()))","title":"Awips grids and cartopy"},{"location":"python/awips-grids-and-cartopy/#with-pcolormesh","text":"cmap = plt.get_cmap('rainbow') fig, ax = make_map(bbox=bbox) cs = ax.pcolormesh(lons, lats, data, cmap=cmap) cbar = fig.colorbar(cs, shrink=0.7, orientation='horizontal') cbar.set_label(str(grid.getLocationName()) +\" \" \\ + str(grid.getLevel()) + \" \" \\ + str(grid.getParameter()) \\ + \" (\" + str(grid.getUnit()) + \") \" \\ + \"valid \" + str(grid.getDataTime().getRefTime()))","title":"with pcolormesh"},{"location":"python/awips-grids-and-cartopy/#with-contourf","text":"fig2, ax2 = make_map(bbox=bbox) cs2 = ax2.contourf(lons, lats, data, 80, cmap=cmap, vmin=data.min(), vmax=data.max()) cbar2 = fig2.colorbar(cs2, shrink=0.7, orientation='horizontal') cbar2.set_label(str(grid.getLocationName()) +\" \" \\ + str(grid.getLevel()) + \" \" \\ + str(grid.getParameter()) \\ + \" (\" + str(grid.getUnit()) + \") \" \\ + \"valid \" + str(grid.getDataTime().getRefTime()))","title":"with contourf"},{"location":"python/derived-parameters/","text":"","title":"Derived parameters"},{"location":"python/map-resources-and-topography/","text":"The python-awips package provides access to the entire AWIPS Maps Database for use in Python GIS applications. Map objects are returned as Shapely geometries ( Polygon , Point , MultiLineString , etc.) and can be plotted by Matplotlib, Cartopy, MetPy, and other packages. Each map database table has a geometry field called the_geom , which can be used to spatially select map resources for any column of type geometry. 
See the Maps Database Reference Page for available database tables, column names, and types. Notes \uf0c1 This notebook requires: python-awips, numpy, matplotlib, cartopy, shapely Use datatype maps and addIdentifier('table', ) to define the map table: DataAccessLayer.changeEDEXHost(\"edex-cloud.unidata.ucar.edu\") request = DataAccessLayer.newDataRequest('maps') request.addIdentifier('table', 'mapdata.county') Use request.setLocationNames() and request.addIdentifier() to spatially filter a map resource. In the example below, WFO ID BOU (Boulder, Colorado) is used to query counties within the BOU County Warning Area (CWA): request.addIdentifier('geomField', 'the_geom') request.addIdentifier('inLocation', 'true') request.addIdentifier('locationField', 'cwa') request.setLocationNames('BOU') request.addIdentifier('cwa', 'BOU') Note the geometry definition of the_geom for each data type, which can be Point , MultiPolygon , or MultiLineString . Setup \uf0c1 from __future__ import print_function from awips.dataaccess import DataAccessLayer import matplotlib.pyplot as plt import cartopy.crs as ccrs import numpy as np from cartopy.mpl.gridliner import LONGITUDE_FORMATTER, LATITUDE_FORMATTER from cartopy.feature import ShapelyFeature,NaturalEarthFeature from shapely.geometry import Polygon from shapely.ops import cascaded_union # Standard map plot def make_map(bbox, projection=ccrs.PlateCarree()): fig, ax = plt.subplots(figsize=(12,12), subplot_kw=dict(projection=projection)) ax.set_extent(bbox) ax.coastlines(resolution='50m') gl = ax.gridlines(draw_labels=True) gl.xlabels_top = gl.ylabels_right = False gl.xformatter = LONGITUDE_FORMATTER gl.yformatter = LATITUDE_FORMATTER return fig, ax # Server, Data Request Type, and Database Table DataAccessLayer.changeEDEXHost(\"edex-cloud.unidata.ucar.edu\") request = DataAccessLayer.newDataRequest('maps') request.addIdentifier('table', 'mapdata.county') Request County Boundaries for a WFO \uf0c1 Use request.setParameters() to define fields to be returned by the request. 
# Define a WFO ID for location # tie this ID to the mapdata.county column \"cwa\" for filtering request.setLocationNames('BOU') request.addIdentifier('cwa', 'BOU') # enable location filtering (inLocation) # locationField is tied to the above cwa definition (BOU) request.addIdentifier('geomField', 'the_geom') request.addIdentifier('inLocation', 'true') request.addIdentifier('locationField', 'cwa') # This is essentially the same as \"'\"select count(*) from mapdata.cwa where cwa='BOU';\" (=1) # Get response and create dict of county geometries response = DataAccessLayer.getGeometryData(request, []) counties = np.array([]) for ob in response: counties = np.append(counties,ob.getGeometry()) print(\"Using \" + str(len(counties)) + \" county MultiPolygons\") %matplotlib inline # All WFO counties merged to a single Polygon merged_counties = cascaded_union(counties) envelope = merged_counties.buffer(2) boundaries=[merged_counties] # Get bounds of this merged Polygon to use as buffered map extent bounds = merged_counties.bounds bbox=[bounds[0]-1,bounds[2]+1,bounds[1]-1.5,bounds[3]+1.5] fig, ax = make_map(bbox=bbox) # Plot political/state boundaries handled by Cartopy political_boundaries = NaturalEarthFeature(category='cultural', name='admin_0_boundary_lines_land', scale='50m', facecolor='none') states = NaturalEarthFeature(category='cultural', name='admin_1_states_provinces_lines', scale='50m', facecolor='none') ax.add_feature(political_boundaries, linestyle='-', edgecolor='black') ax.add_feature(states, linestyle='-', edgecolor='black',linewidth=2) # Plot CWA counties for i, geom in enumerate(counties): cbounds = Polygon(geom) intersection = cbounds.intersection geoms = (intersection(geom) for geom in counties if cbounds.intersects(geom)) shape_feature = ShapelyFeature(geoms,ccrs.PlateCarree(), facecolor='none', linestyle=\"-\",edgecolor='#86989B') ax.add_feature(shape_feature) Using 25 county MultiPolygons Create a merged CWA with cascaded_union \uf0c1 # Plot CWA envelope for i, geom in enumerate(boundaries): gbounds = Polygon(geom) intersection = gbounds.intersection geoms = (intersection(geom) for geom in boundaries if gbounds.intersects(geom)) shape_feature = ShapelyFeature(geoms,ccrs.PlateCarree(), facecolor='none', linestyle=\"-\",linewidth=3.,edgecolor='#cc5000') ax.add_feature(shape_feature) fig WFO boundary spatial filter for interstates \uf0c1 Using the previously-defined envelope=merged_counties.buffer(2) in newDataRequest() to request geometries which fall inside the buffered boundary. 
request = DataAccessLayer.newDataRequest('maps', envelope=envelope) request.addIdentifier('table', 'mapdata.interstate') request.addIdentifier('geomField', 'the_geom') request.addIdentifier('locationField', 'hwy_type') request.addIdentifier('hwy_type', 'I') # I (interstate), U (US highway), or S (state highway) request.setParameters('name') interstates = DataAccessLayer.getGeometryData(request, []) print(\"Using \" + str(len(interstates)) + \" interstate MultiLineStrings\") # Plot interstates for ob in interstates: shape_feature = ShapelyFeature(ob.getGeometry(),ccrs.PlateCarree(), facecolor='none', linestyle=\"-\",edgecolor='orange') ax.add_feature(shape_feature) fig Using 223 interstate MultiLineStrings Road type from select distinct(hwy_type) from mapdata.interstate; I - Interstates U - US Highways S - State Highways Nearby cities \uf0c1 Request the city table and filter by population and progressive disclosure level: Warning : the prog_disc field is not entirely understood and values appear to change significantly depending on WFO site. request = DataAccessLayer.newDataRequest('maps', envelope=envelope) request.addIdentifier('table', 'mapdata.city') request.addIdentifier('geomField', 'the_geom') request.setParameters('name','population','prog_disc') cities = DataAccessLayer.getGeometryData(request, []) print(\"Found \" + str(len(cities)) + \" city Points\") Found 1201 city Points citylist = [] cityname = [] # For BOU, progressive disclosure values above 50 and pop above 5000 looks good for ob in cities: if ((ob.getNumber(\"prog_disc\")>50) and int(ob.getString(\"population\")) > 5000): citylist.append(ob.getGeometry()) cityname.append(ob.getString(\"name\")) print(\"Using \" + str(len(cityname)) + \" city Points\") # Plot city markers ax.scatter([point.x for point in citylist], [point.y for point in citylist], transform=ccrs.Geodetic(),marker=\"+\",facecolor='black') # Plot city names for i, txt in enumerate(cityname): ax.annotate(txt, (citylist[i].x,citylist[i].y), xytext=(3,3), textcoords=\"offset points\") fig Using 57 city Points Lakes \uf0c1 request = DataAccessLayer.newDataRequest('maps', envelope=envelope) request.addIdentifier('table', 'mapdata.lake') request.addIdentifier('geomField', 'the_geom') request.setParameters('name') # Get lake geometries response = DataAccessLayer.getGeometryData(request, []) lakes = np.array([]) for ob in response: lakes = np.append(lakes,ob.getGeometry()) print(\"Using \" + str(len(lakes)) + \" lake MultiPolygons\") # Plot lakes for i, geom in enumerate(lakes): cbounds = Polygon(geom) intersection = cbounds.intersection geoms = (intersection(geom) for geom in lakes if cbounds.intersects(geom)) shape_feature = ShapelyFeature(geoms,ccrs.PlateCarree(), facecolor='blue', linestyle=\"-\",edgecolor='#20B2AA') ax.add_feature(shape_feature) fig Using 208 lake MultiPolygons Major Rivers \uf0c1 request = DataAccessLayer.newDataRequest('maps', envelope=envelope) request.addIdentifier('table', 'mapdata.majorrivers') request.addIdentifier('geomField', 'the_geom') request.setParameters('pname') rivers = DataAccessLayer.getGeometryData(request, []) print(\"Using \" + str(len(rivers)) + \" river MultiLineStrings\") # Plot rivers for ob in rivers: shape_feature = ShapelyFeature(ob.getGeometry(),ccrs.PlateCarree(), facecolor='none', linestyle=\":\",edgecolor='#20B2AA') ax.add_feature(shape_feature) fig Using 758 river MultiLineStrings Topography \uf0c1 Spatial envelopes are required for topo requests, which can become slow to download and render for large (CONUS) 
maps. import numpy.ma as ma request = DataAccessLayer.newDataRequest() request.setDatatype(\"topo\") request.addIdentifier(\"group\", \"/\") request.addIdentifier(\"dataset\", \"full\") request.setEnvelope(envelope) gridData = DataAccessLayer.getGridData(request) print(gridData) print(\"Number of grid records: \" + str(len(gridData))) print(\"Sample grid data shape:\\n\" + str(gridData[0].getRawData().shape) + \"\\n\") print(\"Sample grid data:\\n\" + str(gridData[0].getRawData()) + \"\\n\") [] Number of grid records: 1 Sample grid data shape: (778, 1058) Sample grid data: [[ 1694. 1693. 1688. ..., 757. 761. 762.] [ 1701. 1701. 1701. ..., 758. 760. 762.] [ 1703. 1703. 1703. ..., 760. 761. 762.] ..., [ 1767. 1741. 1706. ..., 769. 762. 768.] [ 1767. 1746. 1716. ..., 775. 765. 761.] [ 1781. 1753. 1730. ..., 766. 762. 759.]] grid=gridData[0] topo=ma.masked_invalid(grid.getRawData()) lons, lats = grid.getLatLonCoords() print(topo.min()) print(topo.max()) # Plot topography cs = ax.contourf(lons, lats, topo, 80, cmap=plt.get_cmap('terrain'),alpha=0.1) cbar = fig.colorbar(cs, extend='both', shrink=0.5, orientation='horizontal') cbar.set_label(\"topography height in meters\") fig 623.0 4328.0","title":"Map resources and topography"},{"location":"python/map-resources-and-topography/#notes","text":"This notebook requires: python-awips, numpy, matplotplib, cartopy, shapely Use datatype maps and addIdentifier('table', ) to define the map table: DataAccessLayer.changeEDEXHost(\"edex-cloud.unidata.ucar.edu\") request = DataAccessLayer.newDataRequest('maps') request.addIdentifier('table', 'mapdata.county') Use request.setLocationNames() and request.addIdentifier() to spatially filter a map resource. In the example below, WFO ID BOU (Boulder, Colorado) is used to query counties within the BOU county watch area (CWA) request.addIdentifier('geomField', 'the_geom') request.addIdentifier('inLocation', 'true') request.addIdentifier('locationField', 'cwa') request.setLocationNames('BOU') request.addIdentifier('cwa', 'BOU') Note the geometry definition of the_geom for each data type, which can be Point , MultiPolygon , or MultiLineString .","title":"Notes"},{"location":"python/map-resources-and-topography/#setup","text":"from __future__ import print_function from awips.dataaccess import DataAccessLayer import matplotlib.pyplot as plt import cartopy.crs as ccrs import numpy as np from cartopy.mpl.gridliner import LONGITUDE_FORMATTER, LATITUDE_FORMATTER from cartopy.feature import ShapelyFeature,NaturalEarthFeature from shapely.geometry import Polygon from shapely.ops import cascaded_union # Standard map plot def make_map(bbox, projection=ccrs.PlateCarree()): fig, ax = plt.subplots(figsize=(12,12), subplot_kw=dict(projection=projection)) ax.set_extent(bbox) ax.coastlines(resolution='50m') gl = ax.gridlines(draw_labels=True) gl.xlabels_top = gl.ylabels_right = False gl.xformatter = LONGITUDE_FORMATTER gl.yformatter = LATITUDE_FORMATTER return fig, ax # Server, Data Request Type, and Database Table DataAccessLayer.changeEDEXHost(\"edex-cloud.unidata.ucar.edu\") request = DataAccessLayer.newDataRequest('maps') request.addIdentifier('table', 'mapdata.county')","title":"Setup"},{"location":"python/map-resources-and-topography/#request-county-boundaries-for-a-wfo","text":"Use request.setParameters() to define fields to be returned by the request. 
# Define a WFO ID for location # tie this ID to the mapdata.county column \"cwa\" for filtering request.setLocationNames('BOU') request.addIdentifier('cwa', 'BOU') # enable location filtering (inLocation) # locationField is tied to the above cwa definition (BOU) request.addIdentifier('geomField', 'the_geom') request.addIdentifier('inLocation', 'true') request.addIdentifier('locationField', 'cwa') # This is essentially the same as \"'\"select count(*) from mapdata.cwa where cwa='BOU';\" (=1) # Get response and create dict of county geometries response = DataAccessLayer.getGeometryData(request, []) counties = np.array([]) for ob in response: counties = np.append(counties,ob.getGeometry()) print(\"Using \" + str(len(counties)) + \" county MultiPolygons\") %matplotlib inline # All WFO counties merged to a single Polygon merged_counties = cascaded_union(counties) envelope = merged_counties.buffer(2) boundaries=[merged_counties] # Get bounds of this merged Polygon to use as buffered map extent bounds = merged_counties.bounds bbox=[bounds[0]-1,bounds[2]+1,bounds[1]-1.5,bounds[3]+1.5] fig, ax = make_map(bbox=bbox) # Plot political/state boundaries handled by Cartopy political_boundaries = NaturalEarthFeature(category='cultural', name='admin_0_boundary_lines_land', scale='50m', facecolor='none') states = NaturalEarthFeature(category='cultural', name='admin_1_states_provinces_lines', scale='50m', facecolor='none') ax.add_feature(political_boundaries, linestyle='-', edgecolor='black') ax.add_feature(states, linestyle='-', edgecolor='black',linewidth=2) # Plot CWA counties for i, geom in enumerate(counties): cbounds = Polygon(geom) intersection = cbounds.intersection geoms = (intersection(geom) for geom in counties if cbounds.intersects(geom)) shape_feature = ShapelyFeature(geoms,ccrs.PlateCarree(), facecolor='none', linestyle=\"-\",edgecolor='#86989B') ax.add_feature(shape_feature) Using 25 county MultiPolygons","title":"Request County Boundaries for a WFO"},{"location":"python/map-resources-and-topography/#create-a-merged-cwa-with-cascaded_union","text":"# Plot CWA envelope for i, geom in enumerate(boundaries): gbounds = Polygon(geom) intersection = gbounds.intersection geoms = (intersection(geom) for geom in boundaries if gbounds.intersects(geom)) shape_feature = ShapelyFeature(geoms,ccrs.PlateCarree(), facecolor='none', linestyle=\"-\",linewidth=3.,edgecolor='#cc5000') ax.add_feature(shape_feature) fig","title":"Create a merged CWA with cascaded_union"},{"location":"python/map-resources-and-topography/#wfo-boundary-spatial-filter-for-interstates","text":"Using the previously-defined envelope=merged_counties.buffer(2) in newDataRequest() to request geometries which fall inside the buffered boundary. 
request = DataAccessLayer.newDataRequest('maps', envelope=envelope) request.addIdentifier('table', 'mapdata.interstate') request.addIdentifier('geomField', 'the_geom') request.addIdentifier('locationField', 'hwy_type') request.addIdentifier('hwy_type', 'I') # I (interstate), U (US highway), or S (state highway) request.setParameters('name') interstates = DataAccessLayer.getGeometryData(request, []) print(\"Using \" + str(len(interstates)) + \" interstate MultiLineStrings\") # Plot interstates for ob in interstates: shape_feature = ShapelyFeature(ob.getGeometry(),ccrs.PlateCarree(), facecolor='none', linestyle=\"-\",edgecolor='orange') ax.add_feature(shape_feature) fig Using 223 interstate MultiLineStrings Road type from select distinct(hwy_type) from mapdata.interstate; I - Interstates U - US Highways S - State Highways","title":"WFO boundary spatial filter for interstates"},{"location":"python/map-resources-and-topography/#nearby-cities","text":"Request the city table and filter by population and progressive disclosure level: Warning : the prog_disc field is not entirely understood and values appear to change significantly depending on WFO site. request = DataAccessLayer.newDataRequest('maps', envelope=envelope) request.addIdentifier('table', 'mapdata.city') request.addIdentifier('geomField', 'the_geom') request.setParameters('name','population','prog_disc') cities = DataAccessLayer.getGeometryData(request, []) print(\"Found \" + str(len(cities)) + \" city Points\") Found 1201 city Points citylist = [] cityname = [] # For BOU, progressive disclosure values above 50 and pop above 5000 looks good for ob in cities: if ((ob.getNumber(\"prog_disc\")>50) and int(ob.getString(\"population\")) > 5000): citylist.append(ob.getGeometry()) cityname.append(ob.getString(\"name\")) print(\"Using \" + str(len(cityname)) + \" city Points\") # Plot city markers ax.scatter([point.x for point in citylist], [point.y for point in citylist], transform=ccrs.Geodetic(),marker=\"+\",facecolor='black') # Plot city names for i, txt in enumerate(cityname): ax.annotate(txt, (citylist[i].x,citylist[i].y), xytext=(3,3), textcoords=\"offset points\") fig Using 57 city Points","title":"Nearby cities"},{"location":"python/map-resources-and-topography/#lakes","text":"request = DataAccessLayer.newDataRequest('maps', envelope=envelope) request.addIdentifier('table', 'mapdata.lake') request.addIdentifier('geomField', 'the_geom') request.setParameters('name') # Get lake geometries response = DataAccessLayer.getGeometryData(request, []) lakes = np.array([]) for ob in response: lakes = np.append(lakes,ob.getGeometry()) print(\"Using \" + str(len(lakes)) + \" lake MultiPolygons\") # Plot lakes for i, geom in enumerate(lakes): cbounds = Polygon(geom) intersection = cbounds.intersection geoms = (intersection(geom) for geom in lakes if cbounds.intersects(geom)) shape_feature = ShapelyFeature(geoms,ccrs.PlateCarree(), facecolor='blue', linestyle=\"-\",edgecolor='#20B2AA') ax.add_feature(shape_feature) fig Using 208 lake MultiPolygons","title":"Lakes"},{"location":"python/map-resources-and-topography/#major-rivers","text":"request = DataAccessLayer.newDataRequest('maps', envelope=envelope) request.addIdentifier('table', 'mapdata.majorrivers') request.addIdentifier('geomField', 'the_geom') request.setParameters('pname') rivers = DataAccessLayer.getGeometryData(request, []) print(\"Using \" + str(len(rivers)) + \" river MultiLineStrings\") # Plot rivers for ob in rivers: shape_feature = ShapelyFeature(ob.getGeometry(),ccrs.PlateCarree(), 
facecolor='none', linestyle=\":\",edgecolor='#20B2AA') ax.add_feature(shape_feature) fig Using 758 river MultiLineStrings","title":"Major Rivers"},{"location":"python/map-resources-and-topography/#topography","text":"Spatial envelopes are required for topo requests, which can become slow to download and render for large (CONUS) maps. import numpy.ma as ma request = DataAccessLayer.newDataRequest() request.setDatatype(\"topo\") request.addIdentifier(\"group\", \"/\") request.addIdentifier(\"dataset\", \"full\") request.setEnvelope(envelope) gridData = DataAccessLayer.getGridData(request) print(gridData) print(\"Number of grid records: \" + str(len(gridData))) print(\"Sample grid data shape:\\n\" + str(gridData[0].getRawData().shape) + \"\\n\") print(\"Sample grid data:\\n\" + str(gridData[0].getRawData()) + \"\\n\") [] Number of grid records: 1 Sample grid data shape: (778, 1058) Sample grid data: [[ 1694. 1693. 1688. ..., 757. 761. 762.] [ 1701. 1701. 1701. ..., 758. 760. 762.] [ 1703. 1703. 1703. ..., 760. 761. 762.] ..., [ 1767. 1741. 1706. ..., 769. 762. 768.] [ 1767. 1746. 1716. ..., 775. 765. 761.] [ 1781. 1753. 1730. ..., 766. 762. 759.]] grid=gridData[0] topo=ma.masked_invalid(grid.getRawData()) lons, lats = grid.getLatLonCoords() print(topo.min()) print(topo.max()) # Plot topography cs = ax.contourf(lons, lats, topo, 80, cmap=plt.get_cmap('terrain'),alpha=0.1) cbar = fig.colorbar(cs, extend='both', shrink=0.5, orientation='horizontal') cbar.set_label(\"topography height in meters\") fig 623.0 4328.0","title":"Topography"},{"location":"python/maps-database/","text":"mapdata.airport \uf0c1 Column Type arpt_id character varying(4) name character varying(42) city character varying(40) state character varying(2) siteno character varying(9) site_type character varying(1) fac_use character varying(2) owner_type character varying(2) elv integer latitude character varying(16) longitude character varying(16) lon double precision lat double precision the_geom geometry(Point,4326) ok mapdata.allrivers \uf0c1 Column Type ihabbsrf_i double precision rr character varying(11) huc integer type character varying(1) pmile double precision pname character varying(30) owname character varying(30) pnmcd character varying(11) ownmcd character varying(11) dsrr double precision dshuc integer usdir character varying(1) lev smallint j smallint termid integer trmblv smallint k smallint the_geom geometry(MultiLineString,4326) mapdata.artcc \uf0c1 Column Type artcc character varying(4) alt character varying(1) name character varying(30) type character varying(5) city character varying(40) id double precision the_geom geometry(MultiPolygon,4326) mapdata.basins \uf0c1 Column Type rfc character varying(7) cwa character varying(5) id character varying(8) name character varying(64) lon double precision lat double precision the_geom geometry(MultiPolygon,4326) the_geom_0 geometry(MultiPolygon,4326) the_geom_0_064 geometry(MultiPolygon,4326) the_geom_0_016 geometry(MultiPolygon,4326) the_geom_0_004 geometry(MultiPolygon,4326) the_geom_0_001 geometry(MultiPolygon,4326) mapdata.canada \uf0c1 Column Type f_code character varying(5) name_en character varying(25) nom_fr character varying(25) country character varying(3) cgns_fid character varying(32) the_geom geometry(MultiPolygon,4326) mapdata.city \uf0c1 Column Type st_fips character varying(4) sfips character varying(2) county_fip character varying(4) cfips character varying(4) pl_fips character varying(7) id character varying(20) name character varying(39) elevation 
character varying(60) pop_1990 numeric population character varying(30) st character varying(6) warngenlev character varying(16) warngentyp character varying(16) watch_warn character varying(3) zwatch_war double precision prog_disc integer zprog_disc double precision comboflag double precision land_water character varying(16) recnum double precision lon double precision lat double precision f3 double precision f4 character varying(254) f6 double precision state character varying(25) the_geom geometry(Point,4326) mapdata.county \uf0c1 Column Type state character varying(2) cwa character varying(9) countyname character varying(24) fips character varying(5) time_zone character varying(2) fe_area character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326) mapdata.customlocations \uf0c1 Column Type bullet character varying(16) name character varying(64) cwa character varying(12) rfc character varying(8) lon numeric lat numeric the_geom geometry(MultiPolygon,4326) mapdata.cwa \uf0c1 Column Type cwa character varying(9) wfo character varying(3) lon numeric lat numeric region character varying(2) fullstaid character varying(4) citystate character varying(50) city character varying(50) state character varying(50) st character varying(2) the_geom geometry(MultiPolygon,4326) mapdata.firewxaor \uf0c1 Column Type cwa character varying(3) wfo character varying(3) the_geom geometry(MultiPolygon,4326) mapdata.firewxzones \uf0c1 Column Type state character varying(2) zone character varying(3) cwa character varying(3) name character varying(254) state_zone character varying(5) time_zone character varying(2) fe_area character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326) mapdata.fix \uf0c1 Column Type id character varying(30) type character varying(2) use character varying(5) state character varying(2) min_alt integer latitude character varying(16) longitude character varying(16) lon double precision lat double precision the_geom geometry(Point,4326) mapdata.highaltitude \uf0c1 Column Type awy_des character varying(2) awy_id character varying(12) awy_type character varying(1) airway character varying(16) newfield1 double precision the_geom geometry(MultiLineString,4326) mapdata.highsea \uf0c1 Column Type wfo character varying(3) name character varying(250) lat numeric lon numeric id character varying(5) the_geom geometry(MultiPolygon,4326) mapdata.highway \uf0c1 Column Type prefix character varying(2) pretype character varying(6) name character varying(30) type character varying(6) suffix character varying(2) class character varying(1) class_rte character varying(1) hwy_type character varying(1) hwy_symbol character varying(20) route character varying(25) the_geom geometry(MultiLineString,4326) mapdata.hsa \uf0c1 Column Type wfo character varying(3) lon double precision lat double precision the_geom geometry(MultiPolygon,4326) mapdata.interstate \uf0c1 Column Type prefix character varying(2) pretype Ushy Hwy Ave Cord Rt Loop I Sthy name character varying(30) type character varying(6) suffix character varying(2) hwy_type I U S hwy_symbol character varying(20) route character varying(25) the_geom geometry(MultiLineString,4326) mapdata.isc \uf0c1 Column Type wfo character varying(3) cwa character varying(3) the_geom geometry(MultiPolygon,4326) mapdata.lake \uf0c1 Column Type name character varying(40) feature character varying(40) lon double precision lat double precision the_geom geometry(MultiPolygon,4326) mapdata.latlon10 \uf0c1 Column Type the_geom 
geometry(MultiLineString,4326) mapdata.lowaltitude \uf0c1 Column Type awy_des character varying(2) awy_id character varying(12) awy_type character varying(1) airway character varying(16) newfield1 double precision the_geom geometry(MultiLineString,4326) mapdata.majorrivers \uf0c1 Column Type rf1_150_id double precision huc integer seg smallint milept double precision seqno double precision rflag character varying(1) owflag character varying(1) tflag character varying(1) sflag character varying(1) type character varying(1) segl double precision lev smallint j smallint k smallint pmile double precision arbsum double precision usdir character varying(1) termid integer trmblv smallint pname character varying(30) pnmcd character varying(11) owname character varying(30) ownmcd character varying(11) dshuc integer dsseg smallint dsmlpt double precision editrf1_ double precision demand double precision ftimped double precision tfimped double precision dir double precision rescode double precision center double precision erf1__ double precision reservoir_ double precision pname_res character varying(30) pnmcd_res character varying(11) meanq double precision lowq double precision meanv double precision lowv double precision worka double precision gagecode double precision strahler double precision rr character varying(11) dsrr double precision huc2 smallint huc4 smallint huc6 integer the_geom geometry(MultiLineString,4326) mapdata.marinesites \uf0c1 Column Type st character varying(3) name character varying(50) prog_disc bigint warngenlev character varying(14) the_geom geometry(Point,4326) mapdata.marinezones \uf0c1 Column Type id character varying(6) wfo character varying(3) gl_wfo character varying(3) name character varying(254) ajoin0 character varying(6) ajoin1 character varying(6) lon numeric lat numeric the_geom geometry(MultiPolygon,4326) mapdata.mexico \uf0c1 Column Type area double precision perimeter double precision st_mx_ double precision st_mx_id double precision name character varying(66) country character varying(127) continent character varying(127) the_geom geometry(MultiPolygon,4326) mapdata.navaid \uf0c1 Column Type id character varying(30) clscode character varying(11) city character varying(40) elv integer freq double precision name character varying(30) status character varying(30) type character varying(25) oprhours character varying(11) oprname character varying(50) latdms character varying(16) londms character varying(16) airway character varying(254) sym smallint lon double precision lat double precision the_geom geometry(Point,4326) mapdata.offshore \uf0c1 Column Type id character varying(50) wfo character varying(10) lon numeric lat numeric location character varying(70) name character varying(90) the_geom geometry(MultiPolygon,4326) mapdata.railroad \uf0c1 Column Type fnode_ double precision tnode_ double precision lpoly_ double precision rpoly_ double precision length numeric railrdl021 double precision railrdl020 double precision feature character varying(18) name character varying(43) state character varying(2) state_fips character varying(2) the_geom geometry(MultiLineString,4326) mapdata.rfc \uf0c1 Column Type site_id character varying(3) state character varying(2) rfc_name character varying(18) rfc_city character varying(25) basin_id character varying(5) the_geom geometry(MultiPolygon,4326) mapdata.specialuse \uf0c1 Column Type name character varying(32) code character varying(16) yn smallint alt_desc character varying(128) artcc character varying(4) ctr_agen 
character varying(128) sch_agen character varying(128) state character varying(2) the_geom geometry(MultiPolygon,4326) mapdata.states \uf0c1 Column Type state character varying(2) name character varying(24) fips character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326) mapdata.timezones \uf0c1 Column Type name character varying(50) time_zone character varying(1) standard character varying(9) advanced character varying(10) unix_time character varying(19) lon double precision lat double precision the_geom geometry(MultiPolygon,4326) mapdata.warngenloc \uf0c1 Column Type name character varying(254) st character varying(3) state character varying(20) population integer warngenlev integer cwa character varying(4) goodness double precision lat numeric lon numeric usedirs numeric(10,0) supdirs character varying(20) landwater character varying(3) recnum integer the_geom geometry(MultiPolygon,4326) mapdata.world \uf0c1 Column Type name character varying(30) count double precision first_coun character varying(2) first_regi character varying(1) the_geom geometry(MultiPolygon,4326) mapdata.zone \uf0c1 Column Type state character varying(2) cwa character varying(9) time_zone character varying(2) fe_area character varying(2) zone character varying(3) name character varying(254) state_zone character varying(5) lon numeric lat numeric shortname character varying(32) the_geom geometry(MultiPolygon,4326)","title":"Maps Database"},{"location":"python/maps-database/#mapdataairport","text":"Column Type arpt_id character varying(4) name character varying(42) city character varying(40) state character varying(2) siteno character varying(9) site_type character varying(1) fac_use character varying(2) owner_type character varying(2) elv integer latitude character varying(16) longitude character varying(16) lon double precision lat double precision the_geom geometry(Point,4326) ok","title":"mapdata.airport"},{"location":"python/maps-database/#mapdataallrivers","text":"Column Type ihabbsrf_i double precision rr character varying(11) huc integer type character varying(1) pmile double precision pname character varying(30) owname character varying(30) pnmcd character varying(11) ownmcd character varying(11) dsrr double precision dshuc integer usdir character varying(1) lev smallint j smallint termid integer trmblv smallint k smallint the_geom geometry(MultiLineString,4326)","title":"mapdata.allrivers"},{"location":"python/maps-database/#mapdataartcc","text":"Column Type artcc character varying(4) alt character varying(1) name character varying(30) type character varying(5) city character varying(40) id double precision the_geom geometry(MultiPolygon,4326)","title":"mapdata.artcc"},{"location":"python/maps-database/#mapdatabasins","text":"Column Type rfc character varying(7) cwa character varying(5) id character varying(8) name character varying(64) lon double precision lat double precision the_geom geometry(MultiPolygon,4326) the_geom_0 geometry(MultiPolygon,4326) the_geom_0_064 geometry(MultiPolygon,4326) the_geom_0_016 geometry(MultiPolygon,4326) the_geom_0_004 geometry(MultiPolygon,4326) the_geom_0_001 geometry(MultiPolygon,4326)","title":"mapdata.basins"},{"location":"python/maps-database/#mapdatacanada","text":"Column Type f_code character varying(5) name_en character varying(25) nom_fr character varying(25) country character varying(3) cgns_fid character varying(32) the_geom geometry(MultiPolygon,4326)","title":"mapdata.canada"},{"location":"python/maps-database/#mapdatacity","text":"Column 
Type st_fips character varying(4) sfips character varying(2) county_fip character varying(4) cfips character varying(4) pl_fips character varying(7) id character varying(20) name character varying(39) elevation character varying(60) pop_1990 numeric population character varying(30) st character varying(6) warngenlev character varying(16) warngentyp character varying(16) watch_warn character varying(3) zwatch_war double precision prog_disc integer zprog_disc double precision comboflag double precision land_water character varying(16) recnum double precision lon double precision lat double precision f3 double precision f4 character varying(254) f6 double precision state character varying(25) the_geom geometry(Point,4326)","title":"mapdata.city"},{"location":"python/maps-database/#mapdatacounty","text":"Column Type state character varying(2) cwa character varying(9) countyname character varying(24) fips character varying(5) time_zone character varying(2) fe_area character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)","title":"mapdata.county"},{"location":"python/maps-database/#mapdatacustomlocations","text":"Column Type bullet character varying(16) name character varying(64) cwa character varying(12) rfc character varying(8) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)","title":"mapdata.customlocations"},{"location":"python/maps-database/#mapdatacwa","text":"Column Type cwa character varying(9) wfo character varying(3) lon numeric lat numeric region character varying(2) fullstaid character varying(4) citystate character varying(50) city character varying(50) state character varying(50) st character varying(2) the_geom geometry(MultiPolygon,4326)","title":"mapdata.cwa"},{"location":"python/maps-database/#mapdatafirewxaor","text":"Column Type cwa character varying(3) wfo character varying(3) the_geom geometry(MultiPolygon,4326)","title":"mapdata.firewxaor"},{"location":"python/maps-database/#mapdatafirewxzones","text":"Column Type state character varying(2) zone character varying(3) cwa character varying(3) name character varying(254) state_zone character varying(5) time_zone character varying(2) fe_area character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)","title":"mapdata.firewxzones"},{"location":"python/maps-database/#mapdatafix","text":"Column Type id character varying(30) type character varying(2) use character varying(5) state character varying(2) min_alt integer latitude character varying(16) longitude character varying(16) lon double precision lat double precision the_geom geometry(Point,4326)","title":"mapdata.fix"},{"location":"python/maps-database/#mapdatahighaltitude","text":"Column Type awy_des character varying(2) awy_id character varying(12) awy_type character varying(1) airway character varying(16) newfield1 double precision the_geom geometry(MultiLineString,4326)","title":"mapdata.highaltitude"},{"location":"python/maps-database/#mapdatahighsea","text":"Column Type wfo character varying(3) name character varying(250) lat numeric lon numeric id character varying(5) the_geom geometry(MultiPolygon,4326)","title":"mapdata.highsea"},{"location":"python/maps-database/#mapdatahighway","text":"Column Type prefix character varying(2) pretype character varying(6) name character varying(30) type character varying(6) suffix character varying(2) class character varying(1) class_rte character varying(1) hwy_type character varying(1) hwy_symbol character varying(20) route character varying(25) the_geom 
geometry(MultiLineString,4326)","title":"mapdata.highway"},{"location":"python/maps-database/#mapdatahsa","text":"Column Type wfo character varying(3) lon double precision lat double precision the_geom geometry(MultiPolygon,4326)","title":"mapdata.hsa"},{"location":"python/maps-database/#mapdatainterstate","text":"Column Type prefix character varying(2) pretype Ushy Hwy Ave Cord Rt Loop I Sthy name character varying(30) type character varying(6) suffix character varying(2) hwy_type I U S hwy_symbol character varying(20) route character varying(25) the_geom geometry(MultiLineString,4326)","title":"mapdata.interstate"},{"location":"python/maps-database/#mapdataisc","text":"Column Type wfo character varying(3) cwa character varying(3) the_geom geometry(MultiPolygon,4326)","title":"mapdata.isc"},{"location":"python/maps-database/#mapdatalake","text":"Column Type name character varying(40) feature character varying(40) lon double precision lat double precision the_geom geometry(MultiPolygon,4326)","title":"mapdata.lake"},{"location":"python/maps-database/#mapdatalatlon10","text":"Column Type the_geom geometry(MultiLineString,4326)","title":"mapdata.latlon10"},{"location":"python/maps-database/#mapdatalowaltitude","text":"Column Type awy_des character varying(2) awy_id character varying(12) awy_type character varying(1) airway character varying(16) newfield1 double precision the_geom geometry(MultiLineString,4326)","title":"mapdata.lowaltitude"},{"location":"python/maps-database/#mapdatamajorrivers","text":"Column Type rf1_150_id double precision huc integer seg smallint milept double precision seqno double precision rflag character varying(1) owflag character varying(1) tflag character varying(1) sflag character varying(1) type character varying(1) segl double precision lev smallint j smallint k smallint pmile double precision arbsum double precision usdir character varying(1) termid integer trmblv smallint pname character varying(30) pnmcd character varying(11) owname character varying(30) ownmcd character varying(11) dshuc integer dsseg smallint dsmlpt double precision editrf1_ double precision demand double precision ftimped double precision tfimped double precision dir double precision rescode double precision center double precision erf1__ double precision reservoir_ double precision pname_res character varying(30) pnmcd_res character varying(11) meanq double precision lowq double precision meanv double precision lowv double precision worka double precision gagecode double precision strahler double precision rr character varying(11) dsrr double precision huc2 smallint huc4 smallint huc6 integer the_geom geometry(MultiLineString,4326)","title":"mapdata.majorrivers"},{"location":"python/maps-database/#mapdatamarinesites","text":"Column Type st character varying(3) name character varying(50) prog_disc bigint warngenlev character varying(14) the_geom geometry(Point,4326)","title":"mapdata.marinesites"},{"location":"python/maps-database/#mapdatamarinezones","text":"Column Type id character varying(6) wfo character varying(3) gl_wfo character varying(3) name character varying(254) ajoin0 character varying(6) ajoin1 character varying(6) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)","title":"mapdata.marinezones"},{"location":"python/maps-database/#mapdatamexico","text":"Column Type area double precision perimeter double precision st_mx_ double precision st_mx_id double precision name character varying(66) country character varying(127) continent character varying(127) the_geom 
geometry(MultiPolygon,4326)","title":"mapdata.mexico"},{"location":"python/maps-database/#mapdatanavaid","text":"Column Type id character varying(30) clscode character varying(11) city character varying(40) elv integer freq double precision name character varying(30) status character varying(30) type character varying(25) oprhours character varying(11) oprname character varying(50) latdms character varying(16) londms character varying(16) airway character varying(254) sym smallint lon double precision lat double precision the_geom geometry(Point,4326)","title":"mapdata.navaid"},{"location":"python/maps-database/#mapdataoffshore","text":"Column Type id character varying(50) wfo character varying(10) lon numeric lat numeric location character varying(70) name character varying(90) the_geom geometry(MultiPolygon,4326)","title":"mapdata.offshore"},{"location":"python/maps-database/#mapdatarailroad","text":"Column Type fnode_ double precision tnode_ double precision lpoly_ double precision rpoly_ double precision length numeric railrdl021 double precision railrdl020 double precision feature character varying(18) name character varying(43) state character varying(2) state_fips character varying(2) the_geom geometry(MultiLineString,4326)","title":"mapdata.railroad"},{"location":"python/maps-database/#mapdatarfc","text":"Column Type site_id character varying(3) state character varying(2) rfc_name character varying(18) rfc_city character varying(25) basin_id character varying(5) the_geom geometry(MultiPolygon,4326)","title":"mapdata.rfc"},{"location":"python/maps-database/#mapdataspecialuse","text":"Column Type name character varying(32) code character varying(16) yn smallint alt_desc character varying(128) artcc character varying(4) ctr_agen character varying(128) sch_agen character varying(128) state character varying(2) the_geom geometry(MultiPolygon,4326)","title":"mapdata.specialuse"},{"location":"python/maps-database/#mapdatastates","text":"Column Type state character varying(2) name character varying(24) fips character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)","title":"mapdata.states"},{"location":"python/maps-database/#mapdatatimezones","text":"Column Type name character varying(50) time_zone character varying(1) standard character varying(9) advanced character varying(10) unix_time character varying(19) lon double precision lat double precision the_geom geometry(MultiPolygon,4326)","title":"mapdata.timezones"},{"location":"python/maps-database/#mapdatawarngenloc","text":"Column Type name character varying(254) st character varying(3) state character varying(20) population integer warngenlev integer cwa character varying(4) goodness double precision lat numeric lon numeric usedirs numeric(10,0) supdirs character varying(20) landwater character varying(3) recnum integer the_geom geometry(MultiPolygon,4326)","title":"mapdata.warngenloc"},{"location":"python/maps-database/#mapdataworld","text":"Column Type name character varying(30) count double precision first_coun character varying(2) first_regi character varying(1) the_geom geometry(MultiPolygon,4326)","title":"mapdata.world"},{"location":"python/maps-database/#mapdatazone","text":"Column Type state character varying(2) cwa character varying(9) time_zone character varying(2) fe_area character varying(2) zone character varying(3) name character varying(254) state_zone character varying(5) lon numeric lat numeric shortname character varying(32) the_geom 
geometry(MultiPolygon,4326)","title":"mapdata.zone"},{"location":"python/model-sounding-data/","text":"The EDEX modelsounding plugin creates 64-level vertical profiles from GFS and ETA (NAM) BUFR products distributed over NOAAport. Requestable parameters are pressure , temperature , specHum , uComp , vComp , omega , cldCvr . from awips.dataaccess import DataAccessLayer import matplotlib.tri as mtri import matplotlib.pyplot as plt from mpl_toolkits.axes_grid1.inset_locator import inset_axes from math import exp, log import numpy as np DataAccessLayer.changeEDEXHost(\"edex-cloud.unidata.ucar.edu\") request = DataAccessLayer.newDataRequest() request.setDatatype(\"modelsounding\") forecastModel = \"GFS\" request.addIdentifier(\"reportType\", forecastModel) request.setParameters(\"pressure\",\"temperature\",\"specHum\",\"uComp\",\"vComp\",\"omega\",\"cldCvr\") locations = DataAccessLayer.getAvailableLocationNames(request) locations.sort() list(locations) ['CHE', 'CRL', 'EAX', 'HSI', 'KDSM', 'KFOE', 'KFRM', 'KFSD', 'KGRI', 'KLNK', 'KMCI', 'KMCW', 'KMHE', 'KMHK', 'KMKC', 'KOFK', 'KOMA', 'KRSL', 'KSLN', 'KSTJ', 'KSUX', 'KTOP', 'KYKN', 'OAX', 'P#8', 'P#9', 'P#A', 'P#G', 'P#I', 'RDD', 'WSC'] request.setLocationNames(\"KMCI\") cycles = DataAccessLayer.getAvailableTimes(request, True) times = DataAccessLayer.getAvailableTimes(request) try: fcstRun = DataAccessLayer.getForecastRun(cycles[-1], times) list(fcstRun) response = DataAccessLayer.getGeometryData(request,[fcstRun[0]]) except: print('No times available') exit tmp,prs,sh = np.array([]),np.array([]),np.array([]) uc,vc,om,cld = np.array([]),np.array([]),np.array([]),np.array([]) for ob in response: tmp = np.append(tmp,ob.getNumber(\"temperature\")) prs = np.append(prs,ob.getNumber(\"pressure\")) sh = np.append(sh,ob.getNumber(\"specHum\")) uc = np.append(uc,ob.getNumber(\"uComp\")) vc = np.append(vc,ob.getNumber(\"vComp\")) om = np.append(om,ob.getNumber(\"omega\")) cld = np.append(cld,ob.getNumber(\"cldCvr\")) print(\"parms = \" + str(ob.getParameters())) print(\"site = \" + str(ob.getLocationName())) print(\"geom = \" + str(ob.getGeometry())) print(\"datetime = \" + str(ob.getDataTime())) print(\"reftime = \" + str(ob.getDataTime().getRefTime())) print(\"fcstHour = \" + str(ob.getDataTime().getFcstTime())) print(\"period = \" + str(ob.getDataTime().getValidPeriod())) sounding_title = forecastModel + \" \" + str(ob.getLocationName()) + \"(\"+ str(ob.getGeometry())+\")\" + str(ob.getDataTime()) parms = ['uComp', 'cldCvr', 'temperature', 'vComp', 'pressure', 'omega', 'specHum'] site = KMCI geom = POINT (-94.72000122070312 39.31999969482422) datetime = 1970-01-18 04:45:50.400000 (0) reftime = Jan 18 70 04:45:50 GMT fcstHour = 0 period = (Jan 18 70 04:45:50 , Jan 18 70 04:45:50 ) Create data arrays and calculate dewpoint from spec. humidity \uf0c1 from metpy.calc import get_wind_components, lcl, dry_lapse, parcel_profile, dewpoint from metpy.calc import get_wind_speed,get_wind_dir, thermo, vapor_pressure from metpy.plots import SkewT, Hodograph from metpy.units import units, concatenate # we can use units.* here... t = (tmp-273.15) * units.degC p = prs/100 * units.mbar u,v = uc*1.94384,vc*1.94384 # m/s to knots spd = get_wind_speed(u, v) * units.knots dir = get_wind_dir(u, v) * units.deg Dewpoint from Specific Humidity \uf0c1 Because the modelsounding plugin does not return dewpoint values, we must calculate the profile ourselves. 
Here are three examples of dewpoint calculated from specific humidity, including a manual calculation following NCEP AWIPS/NSHARP. 1) metpy calculated mixing ratio and vapor pressure \uf0c1 from metpy.calc import get_wind_components, lcl, dry_lapse, parcel_profile, dewpoint from metpy.calc import get_wind_speed,get_wind_dir, thermo, vapor_pressure from metpy.plots import SkewT, Hodograph from metpy.units import units, concatenate # we can use units.* here... t = (tmp-273.15) * units.degC p = prs/100 * units.mbar u,v = uc*1.94384,vc*1.94384 # m/s to knots spd = get_wind_speed(u, v) * units.knots dir = get_wind_dir(u, v) * units.deg rmix = (sh/(1-sh)) *1000 * units('g/kg') e = vapor_pressure(p, rmix) td = dewpoint(e) /Users/mj/miniconda2/lib/python2.7/site-packages/MetPy-0.3.0+34.gcf954c5-py2.7.egg/metpy/calc/thermo.py:371: RuntimeWarning: divide by zero encountered in log val = np.log(e / sat_pressure_0c) /Users/mj/miniconda2/lib/python2.7/site-packages/pint/quantity.py:1236: RuntimeWarning: divide by zero encountered in log out = uf(*mobjs) /Users/mj/miniconda2/lib/python2.7/site-packages/pint/quantity.py:693: RuntimeWarning: invalid value encountered in true_divide magnitude = magnitude_op(self._magnitude, other_magnitude) 2) metpy calculated assuming spec. humidity = mixing ratio \uf0c1 td2 = dewpoint(vapor_pressure(p, sh)) 3) NCEP AWIPS soundingrequest plugin \uf0c1 based on GEMPAK/NSHARP, from https://github.com/Unidata/awips2-ncep/blob/unidata_16.2.2/edex/gov.noaa.nws.ncep.edex.plugin.soundingrequest/src/gov/noaa/nws/ncep/edex/plugin/soundingrequest/handler/MergeSounding.java#L1783 # new arrays ntmp = tmp # where p=pressure(pa), T=temp(C), T0=reference temp(273.16) rh = 0.263*prs*sh / (np.exp(17.67*ntmp/(ntmp+273.15-29.65))) vaps = 6.112 * np.exp((17.67 * ntmp) / (ntmp + 243.5)) vapr = rh * vaps / 100 dwpc = np.array(243.5 * (np.log(6.112) - np.log(vapr)) / (np.log(vapr) - np.log(6.112) - 17.67)) * units.degC /Users/mj/miniconda2/lib/python2.7/site-packages/ipykernel/__main__.py:8: RuntimeWarning: divide by zero encountered in log /Users/mj/miniconda2/lib/python2.7/site-packages/ipykernel/__main__.py:8: RuntimeWarning: invalid value encountered in divide Plot with MetPy \uf0c1 %matplotlib inline plt.rcParams['figure.figsize'] = (12, 14) # Create a skewT plot skew = SkewT() # Plot the data skew.plot(p, t, 'r', linewidth=2) skew.plot(p, td, 'b', linewidth=2) skew.plot(p, td2, 'y') skew.plot(p, dwpc, 'g', linewidth=2) skew.plot_barbs(p, u, v) skew.ax.set_ylim(1000, 100) skew.ax.set_xlim(-40, 60) plt.title(sounding_title) # Calculate LCL height and plot as black dot l = lcl(p[0], t[0], td[0]) lcl_temp = dry_lapse(concatenate((p[0], l)), t[0])[-1].to('degC') skew.plot(l, lcl_temp, 'ko', markerfacecolor='black') # An example of a slanted line at constant T -- in this case the 0 isotherm l = skew.ax.axvline(0, color='c', linestyle='--', linewidth=2) # Draw hodograph ax_hod = inset_axes(skew.ax, '40%', '40%', loc=2) h = Hodograph(ax_hod, component_range=get_wind_speed(u, v).max()) h.add_grid(increment=20) h.plot_colormapped(u, v, spd) # Show the plot plt.show()","title":"Model sounding data"},{"location":"python/model-sounding-data/#create-data-arrays-and-calculate-dewpoint-from-spec-humidity","text":"from metpy.calc import get_wind_components, lcl, dry_lapse, parcel_profile, dewpoint from metpy.calc import get_wind_speed,get_wind_dir, thermo, vapor_pressure from metpy.plots import SkewT, Hodograph from metpy.units import units, concatenate # we can use units.* here... 
t = (tmp-273.15) * units.degC p = prs/100 * units.mbar u,v = uc*1.94384,vc*1.94384 # m/s to knots spd = get_wind_speed(u, v) * units.knots dir = get_wind_dir(u, v) * units.deg","title":"Create data arrays and calculate dewpoint from spec. humidity"},{"location":"python/model-sounding-data/#dewpoint-from-specific-humidity","text":"Because the modelsounding plugin does not return dewpoint values, we must calculate the profile ourselves. Here are three examples of dewpoint calculated from specific humidity, including a manual calculation following NCEP AWIPS/NSHARP.","title":"Dewpoint from Specific Humidity"},{"location":"python/model-sounding-data/#1-metpy-calculated-mixing-ratio-and-vapor-pressure","text":"from metpy.calc import get_wind_components, lcl, dry_lapse, parcel_profile, dewpoint from metpy.calc import get_wind_speed,get_wind_dir, thermo, vapor_pressure from metpy.plots import SkewT, Hodograph from metpy.units import units, concatenate # we can use units.* here... t = (tmp-273.15) * units.degC p = prs/100 * units.mbar u,v = uc*1.94384,vc*1.94384 # m/s to knots spd = get_wind_speed(u, v) * units.knots dir = get_wind_dir(u, v) * units.deg rmix = (sh/(1-sh)) *1000 * units('g/kg') e = vapor_pressure(p, rmix) td = dewpoint(e) /Users/mj/miniconda2/lib/python2.7/site-packages/MetPy-0.3.0+34.gcf954c5-py2.7.egg/metpy/calc/thermo.py:371: RuntimeWarning: divide by zero encountered in log val = np.log(e / sat_pressure_0c) /Users/mj/miniconda2/lib/python2.7/site-packages/pint/quantity.py:1236: RuntimeWarning: divide by zero encountered in log out = uf(*mobjs) /Users/mj/miniconda2/lib/python2.7/site-packages/pint/quantity.py:693: RuntimeWarning: invalid value encountered in true_divide magnitude = magnitude_op(self._magnitude, other_magnitude)","title":"1) metpy calculated mixing ratio and vapor pressure"},{"location":"python/model-sounding-data/#2-metpy-calculated-assuming-spec-humidity-mixing-ratio","text":"td2 = dewpoint(vapor_pressure(p, sh))","title":"2) metpy calculated assuming spec. 
humidity = mixing ratio"},{"location":"python/model-sounding-data/#3-ncep-awips-soundingrequest-plugin","text":"based on GEMPAK/NSHARP, from https://github.com/Unidata/awips2-ncep/blob/unidata_16.2.2/edex/gov.noaa.nws.ncep.edex.plugin.soundingrequest/src/gov/noaa/nws/ncep/edex/plugin/soundingrequest/handler/MergeSounding.java#L1783 # new arrays ntmp = tmp # where p=pressure(pa), T=temp(C), T0=reference temp(273.16) rh = 0.263*prs*sh / (np.exp(17.67*ntmp/(ntmp+273.15-29.65))) vaps = 6.112 * np.exp((17.67 * ntmp) / (ntmp + 243.5)) vapr = rh * vaps / 100 dwpc = np.array(243.5 * (np.log(6.112) - np.log(vapr)) / (np.log(vapr) - np.log(6.112) - 17.67)) * units.degC /Users/mj/miniconda2/lib/python2.7/site-packages/ipykernel/__main__.py:8: RuntimeWarning: divide by zero encountered in log /Users/mj/miniconda2/lib/python2.7/site-packages/ipykernel/__main__.py:8: RuntimeWarning: invalid value encountered in divide","title":"3) NCEP AWIPS soundingrequest plugin"},{"location":"python/model-sounding-data/#plot-with-metpy","text":"%matplotlib inline plt.rcParams['figure.figsize'] = (12, 14) # Create a skewT plot skew = SkewT() # Plot the data skew.plot(p, t, 'r', linewidth=2) skew.plot(p, td, 'b', linewidth=2) skew.plot(p, td2, 'y') skew.plot(p, dwpc, 'g', linewidth=2) skew.plot_barbs(p, u, v) skew.ax.set_ylim(1000, 100) skew.ax.set_xlim(-40, 60) plt.title(sounding_title) # Calculate LCL height and plot as black dot l = lcl(p[0], t[0], td[0]) lcl_temp = dry_lapse(concatenate((p[0], l)), t[0])[-1].to('degC') skew.plot(l, lcl_temp, 'ko', markerfacecolor='black') # An example of a slanted line at constant T -- in this case the 0 isotherm l = skew.ax.axvline(0, color='c', linestyle='--', linewidth=2) # Draw hodograph ax_hod = inset_axes(skew.ax, '40%', '40%', loc=2) h = Hodograph(ax_hod, component_range=get_wind_speed(u, v).max()) h.add_grid(increment=20) h.plot_colormapped(u, v, spd) # Show the plot plt.show()","title":"Plot with MetPy"},{"location":"python/nexrad-level-3-radar/","text":"Shown here are plots for Base Reflectivity (N0Q, 94) and Base Velocity (N0U, 99) using AWIPS data rendered with Matplotlib, Cartopy, and MetPy. This example improves upon existing Level 3 Python rendering by doing the following: Display scaled and labeled colorbar below each figure. Plot radar radial images as coordinate maps in Cartopy and label with lat/lon. 8 bit Z and V colormap and data scaling added to MetPy from operational AWIPS. Level 3 data are retrieved from the Unidata EDEX Cloud server ( edex-cloud.unidata.ucar.edu ) Raw HDF5 byte data are converted to product values and scaled according to (page 3-34 https://www.roc.noaa.gov/wsr88d/PublicDocs/ICDS/2620001U.pdf) The threshold level fields are used to describe (up to) 256 levels as follows: halfword 31 contains the minimum data value in m/s*10 (or dBZ*10) halfword 32 contains the increment in m/s*10 (or dBZ*10) halfword 33 contains the number of levels (0 - 255) According to the ICD for the Product Specification , \"the 256 data levels of the digital product cover a range of reflectivity between -32.0 to +94.5 dBZ, in increments of 0.5 dBZ. Level codes 0 and 1 correspond to 'Below Threshold' and 'Range Folded', respectively, while level codes 2 through 255 correspond to the reflectivity data itself\" . So it's really 254 color values between -32 and +94.5 dBZ. 
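To make the halfword scaling concrete, here is a minimal sketch (not part of the original notebook) of the level-code-to-value conversion that the script below performs with threshVals: physical value = (halfword 31)/10 + code * (halfword 32)/10, with level codes 0 and 1 masked out. The threshold numbers used here are illustrative assumptions chosen to match the ICD description quoted above; in the script they are read from the radar record itself.

import numpy as np
from numpy import ma

# Assumed threshold halfwords, chosen so level code 2 maps to -32.0 dBZ
# and level code 255 maps to +94.5 dBZ as described above:
# halfword 31 = minimum * 10, halfword 32 = increment * 10, halfword 33 = number of levels
threshVals = [-330, 5, 256]

# A few hypothetical 8-bit level codes as unpacked from the HDF5 byte data
codes = np.array([0, 1, 2, 128, 255])

# Convert level codes to dBZ: minimum/10 + code * increment/10
values = ma.array(threshVals[0] / 10. + codes * threshVals[1] / 10.)

# Codes 0 and 1 are 'Below Threshold' and 'Range Folded', so mask them
values[codes < 2] = ma.masked
print(values)  # [-- -- -32.0 31.0 94.5]

The full script below applies the same arithmetic to the entire radial image, using an equivalent mask of values at or below minimum/10 + increment/10 rather than testing the level codes directly.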
The ICD lists 16 specific color levels and directs 256-level reflectivity products to use corresponding colors, leaving it the rendering application to scale and blend between the 16 color values, and to make decisions about discrete color changes, apparently. For AWIPS, the National Weather Service uses a mostly-blended color scale with a discrete jump to red at reflectivity values of 50 dBZ: 50 dBZ corresponds to the 16-level color light red ( FF6060 ). Note that FF6060 is not used in the NWS AWIPS color scale, instead RGB value is given as 255,0,0 (hex code FF0000 ). 60 dBZ is not quite exactly where white starts, but it makes sense that it would. Obviously the AWIPS D2D authors took some liberties with their 256-level rendering, not adhering strictly to \"dark red\" for dBZ values between 60-65 (white was for 70 dBZ and above on the 16-level colormap). For this exercise we will assume 50 dBZ should be red and 60 dBZ white, and 75 dBZ cyan. Setup \uf0c1 pip install python-awips matplotlib cartopy metpy Python Script \uf0c1 Download this script as a Jupyter Notebook . from awips.dataaccess import DataAccessLayer from awips import ThriftClient, RadarCommon from dynamicserialize.dstypes.com.raytheon.uf.common.time import TimeRange from dynamicserialize.dstypes.com.raytheon.uf.common.dataplugin.radar.request import GetRadarDataRecordRequest from datetime import datetime from datetime import timedelta import matplotlib.pyplot as plt import numpy as np from numpy import ma from metpy.plots import ctables import cartopy.crs as ccrs from cartopy.mpl.gridliner import LONGITUDE_FORMATTER, LATITUDE_FORMATTER # set EDEX server and radar site definitions site = 'kmux' DataAccessLayer.changeEDEXHost('edex-cloud.unidata.ucar.edu') request = DataAccessLayer.newDataRequest() request.setDatatype('radar') request.setLocationNames(site) # Get latest time for site datatimes = DataAccessLayer.getAvailableTimes(request) dateTimeStr = str(datatimes[-1]) dateTimeStr = \"2017-02-02 03:53:03\" buffer = 60 # seconds dateTime = datetime.strptime(dateTimeStr, '%Y-%m-%d %H:%M:%S') # Build timerange +/- buffer beginRange = dateTime - timedelta(0, buffer) endRange = dateTime + timedelta(0, buffer) timerange = TimeRange(beginRange, endRange) # GetRadarDataRecordRequest to query site with timerange client = ThriftClient.ThriftClient('edex-cloud.unidata.ucar.edu') request = GetRadarDataRecordRequest() request.setTimeRange(timerange) request.setRadarId(site) # Map config def make_map(bbox, projection=ccrs.PlateCarree()): fig, ax = plt.subplots(figsize=(12, 12), subplot_kw=dict(projection=projection)) ax.set_extent(bbox) ax.coastlines(resolution='50m') gl = ax.gridlines(draw_labels=True) gl.xlabels_top = gl.ylabels_right = False gl.xformatter = LONGITUDE_FORMATTER gl.yformatter = LATITUDE_FORMATTER return fig, ax # ctable defines the colortable, beginning value, data increment # * For N0Q the scale is -20 to +75 dBZ in increments of 0.5 dBZ # * For N0U the scale is -100 to +100 kts in increments of 1 kt nexrad = {} nexrad[\"N0Q\"] = { 'id': 94, 'unit':'dBZ', 'name':'0.5 deg Base Reflectivity', 'ctable': ['NWSStormClearReflectivity',-20., 0.5], 'res': 1000., 'elev': '0.5' } nexrad[\"N0U\"] = { 'id': 99, 'unit':'kts', 'name':'0.5 deg Base Velocity', 'ctable': ['NWS8bitVel',-100.,1.], 'res': 250., 'elev': '0.5' } grids = [] for code in nexrad: request.setProductCode(nexrad[code]['id']) request.setPrimaryElevationAngle(nexrad[code]['elev']) response = client.sendRequest(request) if response.getData(): for record in 
response.getData(): # Get record hdf5 data idra = record.getHdf5Data() rdat,azdat,depVals,threshVals = RadarCommon.get_hdf5_data(idra) dim = rdat.getDimension() lat,lon = float(record.getLatitude()),float(record.getLongitude()) radials,rangeGates = rdat.getSizes() # Convert raw byte to pixel value rawValue=np.array(rdat.getByteData()) array = [] for rec in rawValue: if rec<0: rec+=256 array.append(rec) if azdat: azVals = azdat.getFloatData() az = np.array(RadarCommon.encode_radial(azVals)) dattyp = RadarCommon.get_data_type(azdat) az = np.append(az,az[-1]) header = RadarCommon.get_header(record, format, rangeGates, radials, azdat, 'description') rng = np.linspace(0, rangeGates, rangeGates + 1) # Convert az/range to a lat/lon from pyproj import Geod g = Geod(ellps='clrk66') center_lat = np.ones([len(az),len(rng)])*lat center_lon = np.ones([len(az),len(rng)])*lon az2D = np.ones_like(center_lat)*az[:,None] rng2D = np.ones_like(center_lat)*np.transpose(rng[:,None])*nexrad[code]['res'] lons,lats,back=g.fwd(center_lon,center_lat,az2D,rng2D) bbox = [lons.min(), lons.max(), lats.min(), lats.max()] # Create 2d array multiArray = np.reshape(array, (-1, rangeGates)) data = ma.array(multiArray) # threshVals[0:2] contains halfwords 31,32,33 (min value, increment, num levels) data = ma.array(threshVals[0]/10. + (multiArray)*threshVals[1]/10.) if nexrad[code]['unit'] == 'kts': data[data<-63] = ma.masked data *= 1.94384 # Convert to knots else: data[data<=((threshVals[0]/10.)+threshVals[1]/10.)] = ma.masked # Save our requested grids so we can render them multiple times product = { \"code\": code, \"bbox\": bbox, \"lats\": lats, \"lons\": lons, \"data\": data } grids.append(product) print(\"Processed \"+str(len(grids))+\" grids.\") Processed 2 grids. Plot N0Q and N0U with Cartopy \uf0c1 for rec in grids: code = rec[\"code\"] bbox = rec[\"bbox\"] lats = rec[\"lats\"] lons = rec[\"lons\"] data = rec[\"data\"] # Create figure %matplotlib inline fig, ax = make_map(bbox=bbox) # Colortable filename, beginning value, increment ctable = nexrad[code]['ctable'][0] beg = nexrad[code]['ctable'][1] inc = nexrad[code]['ctable'][2] norm, cmap = ctables.registry.get_with_steps(ctable, beg, inc) cs = ax.pcolormesh(lons, lats, data, norm=norm, cmap=cmap) ax.set_aspect('equal', 'datalim') cbar = plt.colorbar(cs, extend='both', shrink=0.75, orientation='horizontal') cbar.set_label(site.upper()+\" \"+ str(nexrad[code]['res']/1000.) +\"km \" \\ +nexrad[code]['name']+\" (\"+code+\") \" \\ +nexrad[code]['unit']+\" \" \\ +str(record.getDataTime())) # Zoom to within +-2 deg of center ax.set_xlim(lon-2., lon+2.) ax.set_ylim(lat-2., lat+2.) plt.show() compare with the same product scan rendered in AWIPS CAVE (slightly different projections and still some color mapping differences, most noticeable in ground clutter). 
Two-panel plot, zoomed in \uf0c1 fig, axes = plt.subplots(ncols=2,figsize=(12,9), subplot_kw=dict(projection=ccrs.PlateCarree())) i=0 for rec,ax in zip(grids, axes): code = rec[\"code\"] bbox = rec[\"bbox\"] lats = rec[\"lats\"] lons = rec[\"lons\"] data = rec[\"data\"] # Create figure ax.set_extent(bbox) ax.coastlines(resolution='50m') gl = ax.gridlines(draw_labels=True) gl.xlabels_top = gl.ylabels_right = False if i>0: gl.ylabels_left = False # hide right-pane left axis label gl.xformatter = LONGITUDE_FORMATTER gl.yformatter = LATITUDE_FORMATTER # Colortable filename, beginning value, increment colorvals=nexrad[code]['ctable'] ctable = nexrad[code]['ctable'][0] beg = nexrad[code]['ctable'][1] inc = nexrad[code]['ctable'][2] norm, cmap = ctables.registry.get_with_steps(ctable, beg, inc) cs = ax.pcolormesh(lons, lats, data, norm=norm, cmap=cmap) ax.set_aspect('equal', 'datalim') cbar = fig.colorbar(cs, orientation='horizontal', ax=ax) cbar.set_label(site.upper()+\" \"+code+\" \"+nexrad[code]['unit']+\" \"+str(record.getDataTime())) plt.tight_layout() # Zoom ax.set_xlim(lon-.1, lon+.1) ax.set_ylim(lat-.1, lat+.1) i+=1 and again compared to CAVE","title":"Nexrad level 3 radar"},{"location":"python/nexrad-level-3-radar/#setup","text":"pip install python-awips matplotlib cartopy metpy","title":"Setup"},{"location":"python/nexrad-level-3-radar/#python-script","text":"Download this script as a Jupyter Notebook . from awips.dataaccess import DataAccessLayer from awips import ThriftClient, RadarCommon from dynamicserialize.dstypes.com.raytheon.uf.common.time import TimeRange from dynamicserialize.dstypes.com.raytheon.uf.common.dataplugin.radar.request import GetRadarDataRecordRequest from datetime import datetime from datetime import timedelta import matplotlib.pyplot as plt import numpy as np from numpy import ma from metpy.plots import ctables import cartopy.crs as ccrs from cartopy.mpl.gridliner import LONGITUDE_FORMATTER, LATITUDE_FORMATTER # set EDEX server and radar site definitions site = 'kmux' DataAccessLayer.changeEDEXHost('edex-cloud.unidata.ucar.edu') request = DataAccessLayer.newDataRequest() request.setDatatype('radar') request.setLocationNames(site) # Get latest time for site datatimes = DataAccessLayer.getAvailableTimes(request) dateTimeStr = str(datatimes[-1]) dateTimeStr = \"2017-02-02 03:53:03\" buffer = 60 # seconds dateTime = datetime.strptime(dateTimeStr, '%Y-%m-%d %H:%M:%S') # Build timerange +/- buffer beginRange = dateTime - timedelta(0, buffer) endRange = dateTime + timedelta(0, buffer) timerange = TimeRange(beginRange, endRange) # GetRadarDataRecordRequest to query site with timerange client = ThriftClient.ThriftClient('edex-cloud.unidata.ucar.edu') request = GetRadarDataRecordRequest() request.setTimeRange(timerange) request.setRadarId(site) # Map config def make_map(bbox, projection=ccrs.PlateCarree()): fig, ax = plt.subplots(figsize=(12, 12), subplot_kw=dict(projection=projection)) ax.set_extent(bbox) ax.coastlines(resolution='50m') gl = ax.gridlines(draw_labels=True) gl.xlabels_top = gl.ylabels_right = False gl.xformatter = LONGITUDE_FORMATTER gl.yformatter = LATITUDE_FORMATTER return fig, ax # ctable defines the colortable, beginning value, data increment # * For N0Q the scale is -20 to +75 dBZ in increments of 0.5 dBZ # * For N0U the scale is -100 to +100 kts in increments of 1 kt nexrad = {} nexrad[\"N0Q\"] = { 'id': 94, 'unit':'dBZ', 'name':'0.5 deg Base Reflectivity', 'ctable': ['NWSStormClearReflectivity',-20., 0.5], 'res': 1000., 'elev': '0.5' } 
nexrad[\"N0U\"] = { 'id': 99, 'unit':'kts', 'name':'0.5 deg Base Velocity', 'ctable': ['NWS8bitVel',-100.,1.], 'res': 250., 'elev': '0.5' } grids = [] for code in nexrad: request.setProductCode(nexrad[code]['id']) request.setPrimaryElevationAngle(nexrad[code]['elev']) response = client.sendRequest(request) if response.getData(): for record in response.getData(): # Get record hdf5 data idra = record.getHdf5Data() rdat,azdat,depVals,threshVals = RadarCommon.get_hdf5_data(idra) dim = rdat.getDimension() lat,lon = float(record.getLatitude()),float(record.getLongitude()) radials,rangeGates = rdat.getSizes() # Convert raw byte to pixel value rawValue=np.array(rdat.getByteData()) array = [] for rec in rawValue: if rec<0: rec+=256 array.append(rec) if azdat: azVals = azdat.getFloatData() az = np.array(RadarCommon.encode_radial(azVals)) dattyp = RadarCommon.get_data_type(azdat) az = np.append(az,az[-1]) header = RadarCommon.get_header(record, format, rangeGates, radials, azdat, 'description') rng = np.linspace(0, rangeGates, rangeGates + 1) # Convert az/range to a lat/lon from pyproj import Geod g = Geod(ellps='clrk66') center_lat = np.ones([len(az),len(rng)])*lat center_lon = np.ones([len(az),len(rng)])*lon az2D = np.ones_like(center_lat)*az[:,None] rng2D = np.ones_like(center_lat)*np.transpose(rng[:,None])*nexrad[code]['res'] lons,lats,back=g.fwd(center_lon,center_lat,az2D,rng2D) bbox = [lons.min(), lons.max(), lats.min(), lats.max()] # Create 2d array multiArray = np.reshape(array, (-1, rangeGates)) data = ma.array(multiArray) # threshVals[0:2] contains halfwords 31,32,33 (min value, increment, num levels) data = ma.array(threshVals[0]/10. + (multiArray)*threshVals[1]/10.) if nexrad[code]['unit'] == 'kts': data[data<-63] = ma.masked data *= 1.94384 # Convert to knots else: data[data<=((threshVals[0]/10.)+threshVals[1]/10.)] = ma.masked # Save our requested grids so we can render them multiple times product = { \"code\": code, \"bbox\": bbox, \"lats\": lats, \"lons\": lons, \"data\": data } grids.append(product) print(\"Processed \"+str(len(grids))+\" grids.\") Processed 2 grids.","title":"Python Script"},{"location":"python/nexrad-level-3-radar/#plot-n0q-and-n0u-with-cartopy","text":"for rec in grids: code = rec[\"code\"] bbox = rec[\"bbox\"] lats = rec[\"lats\"] lons = rec[\"lons\"] data = rec[\"data\"] # Create figure %matplotlib inline fig, ax = make_map(bbox=bbox) # Colortable filename, beginning value, increment ctable = nexrad[code]['ctable'][0] beg = nexrad[code]['ctable'][1] inc = nexrad[code]['ctable'][2] norm, cmap = ctables.registry.get_with_steps(ctable, beg, inc) cs = ax.pcolormesh(lons, lats, data, norm=norm, cmap=cmap) ax.set_aspect('equal', 'datalim') cbar = plt.colorbar(cs, extend='both', shrink=0.75, orientation='horizontal') cbar.set_label(site.upper()+\" \"+ str(nexrad[code]['res']/1000.) +\"km \" \\ +nexrad[code]['name']+\" (\"+code+\") \" \\ +nexrad[code]['unit']+\" \" \\ +str(record.getDataTime())) # Zoom to within +-2 deg of center ax.set_xlim(lon-2., lon+2.) ax.set_ylim(lat-2., lat+2.) 
plt.show() compare with the same product scan rendered in AWIPS CAVE (slightly different projections and still some color mapping differences, most noticeable in ground clutter).","title":"Plot N0Q and N0U with Cartopy"},{"location":"python/nexrad-level-3-radar/#two-panel-plot-zoomed-in","text":"fig, axes = plt.subplots(ncols=2,figsize=(12,9), subplot_kw=dict(projection=ccrs.PlateCarree())) i=0 for rec,ax in zip(grids, axes): code = rec[\"code\"] bbox = rec[\"bbox\"] lats = rec[\"lats\"] lons = rec[\"lons\"] data = rec[\"data\"] # Create figure ax.set_extent(bbox) ax.coastlines(resolution='50m') gl = ax.gridlines(draw_labels=True) gl.xlabels_top = gl.ylabels_right = False if i>0: gl.ylabels_left = False # hide right-pane left axis label gl.xformatter = LONGITUDE_FORMATTER gl.yformatter = LATITUDE_FORMATTER # Colortable filename, beginning value, increment colorvals=nexrad[code]['ctable'] ctable = nexrad[code]['ctable'][0] beg = nexrad[code]['ctable'][1] inc = nexrad[code]['ctable'][2] norm, cmap = ctables.registry.get_with_steps(ctable, beg, inc) cs = ax.pcolormesh(lons, lats, data, norm=norm, cmap=cmap) ax.set_aspect('equal', 'datalim') cbar = fig.colorbar(cs, orientation='horizontal', ax=ax) cbar.set_label(site.upper()+\" \"+code+\" \"+nexrad[code]['unit']+\" \"+str(record.getDataTime())) plt.tight_layout() # Zoom ax.set_xlim(lon-.1, lon+.1) ax.set_ylim(lat-.1, lat+.1) i+=1 and again compared to CAVE","title":"Two-panel plot, zoomed in"},{"location":"python/overview/","text":"Python-AWIPS \uf0c1 The python-awips package provides a data access framework for requesting meteorological and related datasets from an EDEX server. As with any python code, python-awips can be used to interact in command line, or scripting form, with the EDEX. This is an alternative to using CAVE to interact with the data. Thorough documentation for python-awips can be found here .","title":"Python-AWIPS"},{"location":"python/overview/#python-awips","text":"The python-awips package provides a data access framework for requesting meteorological and related datasets from an EDEX server. As with any python code, python-awips can be used to interact in command line, or scripting form, with the EDEX. This is an alternative to using CAVE to interact with the data. Thorough documentation for python-awips can be found here .","title":"Python-AWIPS"},{"location":"python/python-awips-data-access/","text":"The python-awips package provides a data access framework for requesting grid and geometry datasets from an EDEX server. For a more detailed look at the python-awips package, refer to the full documentation site which includes a number of plotting examples for different data types . Install \uf0c1 pip install python-awips Requirements \uf0c1 Python 2.7+ Numpy 1.7+ Shapely 1.4+ MetPy and enum34 to run Jupyter Notebook examples Example \uf0c1 This examples covers the callable methods of the Python AWIPS DAF when working with gridded data. We start with a connection to an EDEX server, then query data types, then grid names, parameters, levels, and other information. Finally the gridded data is plotted for its domain using Matplotlib and Cartopy. 
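Before stepping through each call, here is a condensed sketch of the whole workflow, assembled from the same calls and values used in the sections that follow (edex-cloud server, RAP13 2-meter temperature): from awips.dataaccess import DataAccessLayer

# connect and build the grid request
DataAccessLayer.changeEDEXHost("edex-cloud.unidata.ucar.edu")
request = DataAccessLayer.newDataRequest()
request.setDatatype("grid")
request.setLocationNames("RAP13")
request.setParameters("T")
request.setLevels("2.0FHAG")

# latest forecast cycle, last available time step
cycles = DataAccessLayer.getAvailableTimes(request, True)
times = DataAccessLayer.getAvailableTimes(request)
fcstRun = DataAccessLayer.getForecastRun(cycles[-1], times)
response = DataAccessLayer.getGridData(request, [fcstRun[-1]])

grid = response[0]
data = grid.getRawData()
lons, lats = grid.getLatLonCoords()
print(grid.getLocationName(), grid.getParameter(), data.shape) Each of these steps is explained individually below.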
DataAccessLayer.changeEDEXHost() \uf0c1 After DataAccessLayer is imported from the package awips.dataaccess , the first step is to define the EDEX data server hostname ( edex-cloud.unidata.ucar.edu for these examples) from awips.dataaccess import DataAccessLayer DataAccessLayer.changeEDEXHost(\"edex-cloud.unidata.ucar.edu\") DataAccessLayer.getSupportedDatatypes() \uf0c1 getSupportedDatatypes() returns a list of available data types offered by the EDEX server defined above. dataTypes = DataAccessLayer.getSupportedDatatypes() list(dataTypes) ['acars', 'airep', 'binlightning', 'bufrmosavn', 'bufrmoseta', 'bufrmosgfs', 'bufrmoshpc', 'bufrmoslamp', 'bufrmosmrf', 'bufrua', 'climate', 'common_obs_spatial', 'ffmp', 'gfe', 'grid', 'hydro', 'maps', 'modelsounding', 'obs', 'pirep', 'practicewarning', 'profiler', 'radar', 'radar_spatial', 'satellite', 'sfcobs', 'topo', 'warning'] DataAccessLayer.newDataRequest() \uf0c1 Now create a new data request, and set the data type to grid and \"locationName\" to RAP40 with setDataType() and setLocationNames() request = DataAccessLayer.newDataRequest() request.setDatatype(\"grid\") DataAccessLayer.getAvailableLocationNames() \uf0c1 With datatype set to \"grid\", we can query all available grid names with getAvailableLocationNames() available_grids = DataAccessLayer.getAvailableLocationNames(request) available_grids.sort() list(available_grids) ['CMC', 'EKDMOS', 'EKDMOS-AK', 'ESTOFS', 'ETSS', 'FFG-ALR', 'FFG-FWR', 'FFG-KRF', 'FFG-MSR', 'FFG-ORN', 'FFG-PTR', 'FFG-RHA', 'FFG-RSA', 'FFG-STR', 'FFG-TAR', 'FFG-TIR', 'FFG-TUA', 'FNMOC-FAROP', 'FNMOC-NCODA', 'FNMOC-WW3', 'FNMOC-WW3-Europe', 'GFS', 'GFS20', 'GFSLAMP5', 'GribModel:9:151:172', 'HFR-EAST_6KM', 'HFR-EAST_PR_6KM', 'HFR-US_EAST_DELAWARE_1KM', 'HFR-US_EAST_FLORIDA_2KM', 'HFR-US_EAST_NORTH_2KM', 'HFR-US_EAST_SOUTH_2KM', 'HFR-US_EAST_VIRGINIA_1KM', 'HFR-US_HAWAII_1KM', 'HFR-US_HAWAII_2KM', 'HFR-US_HAWAII_6KM', 'HFR-US_WEST_500M', 'HFR-US_WEST_CENCAL_2KM', 'HFR-US_WEST_LOSANGELES_1KM', 'HFR-US_WEST_LOSOSOS_1KM', 'HFR-US_WEST_NORTH_2KM', 'HFR-US_WEST_SANFRAN_1KM', 'HFR-US_WEST_SOCAL_2KM', 'HFR-US_WEST_WASHINGTON_1KM', 'HFR-WEST_6KM', 'HPCGuide', 'HPCqpf', 'HPCqpfNDFD', 'HRRR', 'LAMP2p5', 'MOSGuide', 'MPE-Local-ALR', 'MPE-Local-FWR', 'MPE-Local-MSR', 'MPE-Local-ORN', 'MPE-Local-RHA', 'MPE-Local-SJU', 'MPE-Local-STR', 'MPE-Local-TAR', 'MPE-Local-TIR', 'MPE-Mosaic-ALR', 'MPE-Mosaic-FWR', 'MPE-Mosaic-MSR', 'MPE-Mosaic-ORN', 'MPE-Mosaic-RHA', 'MPE-Mosaic-SJU', 'MPE-Mosaic-TAR', 'MPE-Mosaic-TIR', 'NAM12', 'NAM40', 'NAVGEM', 'NCWF', 'NDFD', 'NOHRSC-SNOW', 'NamDNG', 'PROB3HR', 'QPE-ALR', 'QPE-Auto-TUA', 'QPE-FWR', 'QPE-KRF', 'QPE-MSR', 'QPE-Manual-KRF', 'QPE-ORN', 'QPE-RFC-PTR', 'QPE-RFC-RSA', 'QPE-RFC-STR', 'QPE-TIR', 'QPE-TUA', 'QPE-XNAV-ALR', 'QPE-XNAV-FWR', 'QPE-XNAV-KRF', 'QPE-XNAV-MSR', 'QPE-XNAV-ORN', 'QPE-XNAV-SJU', 'QPE-XNAV-TAR', 'QPE-XNAV-TIR', 'QPE-XNAV-TUA', 'RAP13', 'RAP20', 'RAP40', 'RFCqpf', 'RTMA', 'RTOFS-Now-WestAtl', 'RTOFS-Now-WestConus', 'RTOFS-WestAtl', 'RTOFS-WestConus', 'SeaIce', 'TPCWindProb', 'UKMET-MODEL1', 'URMA25', 'WaveWatch'] Set grid name with setLocationNames() \uf0c1 request.setLocationNames(\"RAP13\") List Available Parameters for a Grid \uf0c1 DataAccessLayer.getAvailableParameters() \uf0c1 After datatype and model name (locationName) are set, you can query all available parameters with getAvailableParameters() availableParms = DataAccessLayer.getAvailableParameters(request) availableParms.sort() list(availableParms) ['0to5', '2xTP6hr', 'AV', 'Along', 'AppT', 'BLI', 'BlkMag', 'BlkShr', 
'CAPE', 'CFRZR', 'CICEP', 'CIn', 'CP', 'CP1hr', 'CPr', 'CPrD', 'CRAIN', 'CSNOW', 'CURU', 'CXR', 'CapeStk', 'Corf', 'CorfF', 'CorfFM', 'CorfM', 'CritT1', 'DIABi', 'DivF', 'DivFn', 'DivFs', 'DpD', 'DpDt', 'DpT', 'Dpress', 'DthDt', 'EHI', 'EHI01', 'EHIi', 'EPT', 'EPTA', 'EPTC', 'EPTGrd', 'EPTGrdM', 'EPTs', 'EPVg', 'EPVs', 'EPVt1', 'EPVt2', 'FVecs', 'FeatMot', 'FnVecs', 'FsVecs', 'Fzra1', 'Fzra2', 'GH', 'GHxSM', 'GHxSM2', 'Gust', 'HI', 'HI1', 'HI3', 'HI4', 'HIdx', 'HPBL', 'Heli', 'Into', 'KI', 'L-I', 'LIsfc2x', 'LgSP1hr', 'MAdv', 'MCon', 'MCon2', 'MMSP', 'MSFDi', 'MSFi', 'MSFmi', 'MSG', 'MTV', 'Mix1', 'Mix2', 'Mmag', 'MnT', 'MpV', 'MxT', 'NBE', 'NetIO', 'OmDiff', 'P', 'PAdv', 'PBE', 'PFrnt', 'PGrd', 'PGrd1', 'PGrdM', 'PIVA', 'PR', 'PTvA', 'PTyp', 'PVV', 'PW', 'PW2', 'PoT', 'PoTA', 'QPV1', 'QPV2', 'QPV3', 'QPV4', 'REFC', 'RH', 'RH_001_bin', 'RH_002_bin', 'RM5', 'RMGH2', 'RRtype', 'RV', 'Rain1', 'Rain2', 'Rain3', 'Ro', 'SH', 'SHx', 'SLI', 'SNW', 'SNWA', 'SRMm', 'SRMmM', 'SSi', 'Shear', 'ShrMag', 'SnD', 'Snow1', 'Snow2', 'Snow3', 'SnowT', 'St-Pr', 'St-Pr1hr', 'StrTP', 'StrmMot', 'T', 'TAdv', 'TGrd', 'TGrdM', 'TP', 'TP12hr', 'TP168hr', 'TP1hr', 'TP24hr', 'TP36hr', 'TP3hr', 'TP48hr', 'TP6hr', 'TP72hr', 'TPrun', 'TPx12x6', 'TPx1x3', 'TQIND', 'TV', 'TW', 'T_001_bin', 'Tdef', 'Tdend', 'ThGrd', 'TmDpD', 'Tmax', 'Tmin', 'TotQi', 'Tstk', 'TwMax', 'TwMin', 'Twstk', 'TxSM', 'USTM', 'VAdv', 'VAdvAdvection', 'VSTM', 'Vis', 'WD', 'WEASD', 'WEASD1hr', 'WGS', 'Wind', 'WndChl', 'ageoVC', 'ageoW', 'ageoWM', 'cCape', 'cCin', 'cTOT', 'capeToLvl', 'dCape', 'dGH12', 'dP', 'dP1hr', 'dP3hr', 'dP6hr', 'dPW1hr', 'dPW3hr', 'dPW6hr', 'dT', 'dVAdv', 'dZ', 'defV', 'del2gH', 'df', 'fGen', 'fnD', 'fsD', 'gamma', 'gammaE', 'geoVort', 'geoW', 'geoWM', 'mixRat', 'msl-P', 'muCape', 'pV', 'pVeq', 'qDiv', 'qVec', 'qnVec', 'qsVec', 'shWlt', 'snoRatCrocus', 'snoRatEMCSREF', 'snoRatSPC', 'snoRatSPCdeep', 'snoRatSPCsurface', 'swtIdx', 'tTOT', 'tWind', 'tWindU', 'tWindV', 'uFX', 'uW', 'vSmthW', 'vTOT', 'vW', 'vertCirc', 'wDiv', 'wSp', 'wSp_001_bin', 'wSp_002_bin', 'wSp_003_bin', 'wSp_004_bin', 'zAGL'] setParameters() \uf0c1 set the request parameter request.setParameters(\"T\") List Available Levels for Parameter \uf0c1 Using DataAccessLayer.getAvailableLevels() availableLevels = DataAccessLayer.getAvailableLevels(request) for level in availableLevels: print(level) 0.0SFC 350.0MB 475.0MB 225.0MB 120.0_150.0BL 900.0MB 125.0MB 450.0MB 575.0MB 325.0MB 100.0MB 1000.0MB 60.0_90.0BL 275.0MB 1.0PV 950.0MB 150.0MB 1.5PV 700.0MB 825.0MB 150.0_180.0BL 250.0MB 375.0MB 1000.0_500.0MB 800.0MB 925.0MB 2.0PV 0.5PV 0.0TROP 750.0MB 500.0MB 625.0MB 400.0MB 0.0FHAG 2.0FHAG 875.0MB 175.0MB 850.0MB 600.0MB 725.0MB 975.0MB 550.0MB 675.0MB 425.0MB 200.0MB 0.0_30.0BL 30.0_60.0BL 650.0MB 525.0MB 300.0MB 90.0_120.0BL 775.0MB 0.0TILT 0.5TILT 340.0_350.0K 290.0_300.0K 700.0_600.0MB 700.0_300.0MB 320.0Ke 800.0_750.0MB 60.0TILT 5.3TILT 1000.0_900.0MB 340.0K 255.0K 255.0_265.0K 25.0TILT 1000.0_850.0MB 850.0_250.0MB 280.0_290.0Ke 320.0_330.0K 310.0_320.0Ke 310.0Ke 330.0K 900.0_800.0MB 550.0_500.0MB 2.4TILT 50.0TILT 35.0TILT 12.0TILT 300.0_310.0K 0.9TILT 320.0K 400.0_350.0MB 750.0_700.0MB 345.0K 250.0_260.0K 300.0Ke 290.0Ke 950.0_900.0MB 275.0_285.0Ke 335.0Ke 295.0_305.0Ke 275.0_285.0K 600.0_550.0MB 310.0K 335.0K 700.0_500.0MB 325.0_335.0K 300.0K 0.0MAXOMEGA 315.0_325.0K 325.0K 340.0Ke 300.0_250.0MB 1.5TILT 335.0_345.0K 315.0K 3.4TILT 330.0Ke 500.0_400.0MB 305.0K 285.0_295.0Ke 14.0TILT 325.0_335.0Ke 850.0_800.0MB 295.0Ke 305.0Ke 265.0_275.0K 700.0_650.0MB 450.0_400.0MB 
1.8TILT 330.0_340.0K 800.0_700.0MB 850.0_300.0MB 6.0TILT 900.0_850.0MB 320.0_330.0Ke 8.7TILT 650.0_600.0MB 600.0_400.0MB 55.0TILT 270.0_280.0Ke 30.0TILT 310.0_320.0K 1000.0_950.0MB 250.0_200.0MB 400.0_300.0MB 500.0_100.0MB 285.0Ke 290.0K 305.0_315.0K 285.0_295.0K 925.0_850.0MB 275.0Ke 300.0_200.0MB 260.0_270.0K 315.0_325.0Ke 600.0_500.0MB 16.7TILT 280.0K 500.0_250.0MB 40.0TILT 400.0_200.0MB 300.0_310.0Ke 270.0_280.0K 1000.0_700.0MB 45.0TILT 850.0_500.0MB 295.0K 4.3TILT 295.0_305.0K 330.0_340.0Ke 270.0K 280.0_290.0K 925.0_700.0MB 260.0K 10.0TILT 325.0Ke 285.0K 290.0_300.0Ke 7.5TILT 280.0Ke 500.0_450.0MB 305.0_315.0Ke 250.0K 250.0_350.0K 270.0Ke 275.0K 315.0Ke 500.0_300.0MB 350.0_300.0MB 19.5TILT 850.0_700.0MB 350.0K 265.0K 0.0_0.0SFC 0.0SFC is the Surface level FHAG stands for Fixed Height Above Ground (in meters) NTAT stands for Nominal Top of the ATmosphere BL stands for Boundary Layer, where 0.0_30.0BL reads as 0-30 mb above ground level TROP is the Tropopause level request.setLevels() \uf0c1 For this example we will use Surface Temperature request.setLevels(\"2.0FHAG\") DataAccessLayer.getAvailableTimes() \uf0c1 getAvailableTimes(request, True) will return an object of run times - formatted as YYYY-MM-DD HH:MM:SS getAvailableTimes(request) will return an object of all times - formatted as YYYY-MM-DD HH:MM:SS (F:ff) getForecastRun(cycle, times) will return a DataTime array for a single forecast cycle. Request a Grid \uf0c1 DataAccessLayer.getGridData() \uf0c1 Now that we have our request and DataTime fcstRun arrays ready, it's time to request the data array from EDEX. cycles = DataAccessLayer.getAvailableTimes(request, True) times = DataAccessLayer.getAvailableTimes(request) fcstRun = DataAccessLayer.getForecastRun(cycles[-1], times) response = DataAccessLayer.getGridData(request, [fcstRun[-1]]) for grid in response: data = grid.getRawData() lons, lats = grid.getLatLonCoords() print('Time :', str(grid.getDataTime())) print('Model:', str(grid.getLocationName())) print('Parm :', str(grid.getParameter())) print('Unit :', str(grid.getUnit())) print(data.shape) print(data.min(), data.max()) ('Time :', '2017-08-14 14:00:00 (21)') ('Model:', 'RAP13') ('Parm :', 'T') ('Unit :', 'K') (337, 451) (271.21939, 306.71939)","title":"Python awips data access"},{"location":"python/python-awips-data-access/#install","text":"pip install python-awips","title":"Install"},{"location":"python/python-awips-data-access/#requirements","text":"Python 2.7+ Numpy 1.7+ Shapely 1.4+ MetPy and enum34 to run Jupyter Notebook examples","title":"Requirements"},{"location":"python/python-awips-data-access/#example","text":"This examples covers the callable methods of the Python AWIPS DAF when working with gridded data. We start with a connection to an EDEX server, then query data types, then grid names, parameters, levels, and other information. 
Finally the gridded data is plotted for its domain using Matplotlib and Cartopy.","title":"Example"},{"location":"python/python-awips-data-access/#dataaccesslayerchangeedexhost","text":"After DataAccessLayer is imported from the package awips.dataaccess , the first step is to define the EDEX data server hostname ( edex-cloud.unidata.ucar.edu for these examples) from awips.dataaccess import DataAccessLayer DataAccessLayer.changeEDEXHost(\"edex-cloud.unidata.ucar.edu\")","title":"DataAccessLayer.changeEDEXHost()"},{"location":"python/python-awips-data-access/#dataaccesslayergetsupporteddatatypes","text":"getSupportedDatatypes() returns a list of available data types offered by the EDEX server defined above. dataTypes = DataAccessLayer.getSupportedDatatypes() list(dataTypes) ['acars', 'airep', 'binlightning', 'bufrmosavn', 'bufrmoseta', 'bufrmosgfs', 'bufrmoshpc', 'bufrmoslamp', 'bufrmosmrf', 'bufrua', 'climate', 'common_obs_spatial', 'ffmp', 'gfe', 'grid', 'hydro', 'maps', 'modelsounding', 'obs', 'pirep', 'practicewarning', 'profiler', 'radar', 'radar_spatial', 'satellite', 'sfcobs', 'topo', 'warning']","title":"DataAccessLayer.getSupportedDatatypes()"},{"location":"python/python-awips-data-access/#dataaccesslayernewdatarequest","text":"Now create a new data request, and set the data type to grid and \"locationName\" to RAP40 with setDataType() and setLocationNames() request = DataAccessLayer.newDataRequest() request.setDatatype(\"grid\")","title":"DataAccessLayer.newDataRequest()"},{"location":"python/python-awips-data-access/#dataaccesslayergetavailablelocationnames","text":"With datatype set to \"grid\", we can query all available grid names with getAvailableLocationNames() available_grids = DataAccessLayer.getAvailableLocationNames(request) available_grids.sort() list(available_grids) ['CMC', 'EKDMOS', 'EKDMOS-AK', 'ESTOFS', 'ETSS', 'FFG-ALR', 'FFG-FWR', 'FFG-KRF', 'FFG-MSR', 'FFG-ORN', 'FFG-PTR', 'FFG-RHA', 'FFG-RSA', 'FFG-STR', 'FFG-TAR', 'FFG-TIR', 'FFG-TUA', 'FNMOC-FAROP', 'FNMOC-NCODA', 'FNMOC-WW3', 'FNMOC-WW3-Europe', 'GFS', 'GFS20', 'GFSLAMP5', 'GribModel:9:151:172', 'HFR-EAST_6KM', 'HFR-EAST_PR_6KM', 'HFR-US_EAST_DELAWARE_1KM', 'HFR-US_EAST_FLORIDA_2KM', 'HFR-US_EAST_NORTH_2KM', 'HFR-US_EAST_SOUTH_2KM', 'HFR-US_EAST_VIRGINIA_1KM', 'HFR-US_HAWAII_1KM', 'HFR-US_HAWAII_2KM', 'HFR-US_HAWAII_6KM', 'HFR-US_WEST_500M', 'HFR-US_WEST_CENCAL_2KM', 'HFR-US_WEST_LOSANGELES_1KM', 'HFR-US_WEST_LOSOSOS_1KM', 'HFR-US_WEST_NORTH_2KM', 'HFR-US_WEST_SANFRAN_1KM', 'HFR-US_WEST_SOCAL_2KM', 'HFR-US_WEST_WASHINGTON_1KM', 'HFR-WEST_6KM', 'HPCGuide', 'HPCqpf', 'HPCqpfNDFD', 'HRRR', 'LAMP2p5', 'MOSGuide', 'MPE-Local-ALR', 'MPE-Local-FWR', 'MPE-Local-MSR', 'MPE-Local-ORN', 'MPE-Local-RHA', 'MPE-Local-SJU', 'MPE-Local-STR', 'MPE-Local-TAR', 'MPE-Local-TIR', 'MPE-Mosaic-ALR', 'MPE-Mosaic-FWR', 'MPE-Mosaic-MSR', 'MPE-Mosaic-ORN', 'MPE-Mosaic-RHA', 'MPE-Mosaic-SJU', 'MPE-Mosaic-TAR', 'MPE-Mosaic-TIR', 'NAM12', 'NAM40', 'NAVGEM', 'NCWF', 'NDFD', 'NOHRSC-SNOW', 'NamDNG', 'PROB3HR', 'QPE-ALR', 'QPE-Auto-TUA', 'QPE-FWR', 'QPE-KRF', 'QPE-MSR', 'QPE-Manual-KRF', 'QPE-ORN', 'QPE-RFC-PTR', 'QPE-RFC-RSA', 'QPE-RFC-STR', 'QPE-TIR', 'QPE-TUA', 'QPE-XNAV-ALR', 'QPE-XNAV-FWR', 'QPE-XNAV-KRF', 'QPE-XNAV-MSR', 'QPE-XNAV-ORN', 'QPE-XNAV-SJU', 'QPE-XNAV-TAR', 'QPE-XNAV-TIR', 'QPE-XNAV-TUA', 'RAP13', 'RAP20', 'RAP40', 'RFCqpf', 'RTMA', 'RTOFS-Now-WestAtl', 'RTOFS-Now-WestConus', 'RTOFS-WestAtl', 'RTOFS-WestConus', 'SeaIce', 'TPCWindProb', 'UKMET-MODEL1', 'URMA25', 
'WaveWatch']","title":"DataAccessLayer.getAvailableLocationNames()"},{"location":"python/python-awips-data-access/#set-grid-name-with-setlocationnames","text":"request.setLocationNames(\"RAP13\")","title":"Set grid name with setLocationNames()"},{"location":"python/python-awips-data-access/#list-available-parameters-for-a-grid","text":"","title":"List Available Parameters for a Grid"},{"location":"python/python-awips-data-access/#dataaccesslayergetavailableparameters","text":"After datatype and model name (locationName) are set, you can query all available parameters with getAvailableParameters() availableParms = DataAccessLayer.getAvailableParameters(request) availableParms.sort() list(availableParms) ['0to5', '2xTP6hr', 'AV', 'Along', 'AppT', 'BLI', 'BlkMag', 'BlkShr', 'CAPE', 'CFRZR', 'CICEP', 'CIn', 'CP', 'CP1hr', 'CPr', 'CPrD', 'CRAIN', 'CSNOW', 'CURU', 'CXR', 'CapeStk', 'Corf', 'CorfF', 'CorfFM', 'CorfM', 'CritT1', 'DIABi', 'DivF', 'DivFn', 'DivFs', 'DpD', 'DpDt', 'DpT', 'Dpress', 'DthDt', 'EHI', 'EHI01', 'EHIi', 'EPT', 'EPTA', 'EPTC', 'EPTGrd', 'EPTGrdM', 'EPTs', 'EPVg', 'EPVs', 'EPVt1', 'EPVt2', 'FVecs', 'FeatMot', 'FnVecs', 'FsVecs', 'Fzra1', 'Fzra2', 'GH', 'GHxSM', 'GHxSM2', 'Gust', 'HI', 'HI1', 'HI3', 'HI4', 'HIdx', 'HPBL', 'Heli', 'Into', 'KI', 'L-I', 'LIsfc2x', 'LgSP1hr', 'MAdv', 'MCon', 'MCon2', 'MMSP', 'MSFDi', 'MSFi', 'MSFmi', 'MSG', 'MTV', 'Mix1', 'Mix2', 'Mmag', 'MnT', 'MpV', 'MxT', 'NBE', 'NetIO', 'OmDiff', 'P', 'PAdv', 'PBE', 'PFrnt', 'PGrd', 'PGrd1', 'PGrdM', 'PIVA', 'PR', 'PTvA', 'PTyp', 'PVV', 'PW', 'PW2', 'PoT', 'PoTA', 'QPV1', 'QPV2', 'QPV3', 'QPV4', 'REFC', 'RH', 'RH_001_bin', 'RH_002_bin', 'RM5', 'RMGH2', 'RRtype', 'RV', 'Rain1', 'Rain2', 'Rain3', 'Ro', 'SH', 'SHx', 'SLI', 'SNW', 'SNWA', 'SRMm', 'SRMmM', 'SSi', 'Shear', 'ShrMag', 'SnD', 'Snow1', 'Snow2', 'Snow3', 'SnowT', 'St-Pr', 'St-Pr1hr', 'StrTP', 'StrmMot', 'T', 'TAdv', 'TGrd', 'TGrdM', 'TP', 'TP12hr', 'TP168hr', 'TP1hr', 'TP24hr', 'TP36hr', 'TP3hr', 'TP48hr', 'TP6hr', 'TP72hr', 'TPrun', 'TPx12x6', 'TPx1x3', 'TQIND', 'TV', 'TW', 'T_001_bin', 'Tdef', 'Tdend', 'ThGrd', 'TmDpD', 'Tmax', 'Tmin', 'TotQi', 'Tstk', 'TwMax', 'TwMin', 'Twstk', 'TxSM', 'USTM', 'VAdv', 'VAdvAdvection', 'VSTM', 'Vis', 'WD', 'WEASD', 'WEASD1hr', 'WGS', 'Wind', 'WndChl', 'ageoVC', 'ageoW', 'ageoWM', 'cCape', 'cCin', 'cTOT', 'capeToLvl', 'dCape', 'dGH12', 'dP', 'dP1hr', 'dP3hr', 'dP6hr', 'dPW1hr', 'dPW3hr', 'dPW6hr', 'dT', 'dVAdv', 'dZ', 'defV', 'del2gH', 'df', 'fGen', 'fnD', 'fsD', 'gamma', 'gammaE', 'geoVort', 'geoW', 'geoWM', 'mixRat', 'msl-P', 'muCape', 'pV', 'pVeq', 'qDiv', 'qVec', 'qnVec', 'qsVec', 'shWlt', 'snoRatCrocus', 'snoRatEMCSREF', 'snoRatSPC', 'snoRatSPCdeep', 'snoRatSPCsurface', 'swtIdx', 'tTOT', 'tWind', 'tWindU', 'tWindV', 'uFX', 'uW', 'vSmthW', 'vTOT', 'vW', 'vertCirc', 'wDiv', 'wSp', 'wSp_001_bin', 'wSp_002_bin', 'wSp_003_bin', 'wSp_004_bin', 'zAGL']","title":"DataAccessLayer.getAvailableParameters()"},{"location":"python/python-awips-data-access/#setparameters","text":"set the request parameter request.setParameters(\"T\")","title":"setParameters()"},{"location":"python/python-awips-data-access/#list-available-levels-for-parameter","text":"Using DataAccessLayer.getAvailableLevels() availableLevels = DataAccessLayer.getAvailableLevels(request) for level in availableLevels: print(level) 0.0SFC 350.0MB 475.0MB 225.0MB 120.0_150.0BL 900.0MB 125.0MB 450.0MB 575.0MB 325.0MB 100.0MB 1000.0MB 60.0_90.0BL 275.0MB 1.0PV 950.0MB 150.0MB 1.5PV 700.0MB 825.0MB 150.0_180.0BL 250.0MB 375.0MB 1000.0_500.0MB 800.0MB 925.0MB 2.0PV 0.5PV 
0.0TROP 750.0MB 500.0MB 625.0MB 400.0MB 0.0FHAG 2.0FHAG 875.0MB 175.0MB 850.0MB 600.0MB 725.0MB 975.0MB 550.0MB 675.0MB 425.0MB 200.0MB 0.0_30.0BL 30.0_60.0BL 650.0MB 525.0MB 300.0MB 90.0_120.0BL 775.0MB 0.0TILT 0.5TILT 340.0_350.0K 290.0_300.0K 700.0_600.0MB 700.0_300.0MB 320.0Ke 800.0_750.0MB 60.0TILT 5.3TILT 1000.0_900.0MB 340.0K 255.0K 255.0_265.0K 25.0TILT 1000.0_850.0MB 850.0_250.0MB 280.0_290.0Ke 320.0_330.0K 310.0_320.0Ke 310.0Ke 330.0K 900.0_800.0MB 550.0_500.0MB 2.4TILT 50.0TILT 35.0TILT 12.0TILT 300.0_310.0K 0.9TILT 320.0K 400.0_350.0MB 750.0_700.0MB 345.0K 250.0_260.0K 300.0Ke 290.0Ke 950.0_900.0MB 275.0_285.0Ke 335.0Ke 295.0_305.0Ke 275.0_285.0K 600.0_550.0MB 310.0K 335.0K 700.0_500.0MB 325.0_335.0K 300.0K 0.0MAXOMEGA 315.0_325.0K 325.0K 340.0Ke 300.0_250.0MB 1.5TILT 335.0_345.0K 315.0K 3.4TILT 330.0Ke 500.0_400.0MB 305.0K 285.0_295.0Ke 14.0TILT 325.0_335.0Ke 850.0_800.0MB 295.0Ke 305.0Ke 265.0_275.0K 700.0_650.0MB 450.0_400.0MB 1.8TILT 330.0_340.0K 800.0_700.0MB 850.0_300.0MB 6.0TILT 900.0_850.0MB 320.0_330.0Ke 8.7TILT 650.0_600.0MB 600.0_400.0MB 55.0TILT 270.0_280.0Ke 30.0TILT 310.0_320.0K 1000.0_950.0MB 250.0_200.0MB 400.0_300.0MB 500.0_100.0MB 285.0Ke 290.0K 305.0_315.0K 285.0_295.0K 925.0_850.0MB 275.0Ke 300.0_200.0MB 260.0_270.0K 315.0_325.0Ke 600.0_500.0MB 16.7TILT 280.0K 500.0_250.0MB 40.0TILT 400.0_200.0MB 300.0_310.0Ke 270.0_280.0K 1000.0_700.0MB 45.0TILT 850.0_500.0MB 295.0K 4.3TILT 295.0_305.0K 330.0_340.0Ke 270.0K 280.0_290.0K 925.0_700.0MB 260.0K 10.0TILT 325.0Ke 285.0K 290.0_300.0Ke 7.5TILT 280.0Ke 500.0_450.0MB 305.0_315.0Ke 250.0K 250.0_350.0K 270.0Ke 275.0K 315.0Ke 500.0_300.0MB 350.0_300.0MB 19.5TILT 850.0_700.0MB 350.0K 265.0K 0.0_0.0SFC 0.0SFC is the Surface level FHAG stands for Fixed Height Above Ground (in meters) NTAT stands for Nominal Top of the ATmosphere BL stands for Boundary Layer, where 0.0_30.0BL reads as 0-30 mb above ground level TROP is the Tropopause level","title":"List Available Levels for Parameter"},{"location":"python/python-awips-data-access/#requestsetlevels","text":"For this example we will use Surface Temperature request.setLevels(\"2.0FHAG\")","title":"request.setLevels()"},{"location":"python/python-awips-data-access/#dataaccesslayergetavailabletimes","text":"getAvailableTimes(request, True) will return an object of run times - formatted as YYYY-MM-DD HH:MM:SS getAvailableTimes(request) will return an object of all times - formatted as YYYY-MM-DD HH:MM:SS (F:ff) getForecastRun(cycle, times) will return a DataTime array for a single forecast cycle.","title":"DataAccessLayer.getAvailableTimes()"},{"location":"python/python-awips-data-access/#request-a-grid","text":"","title":"Request a Grid"},{"location":"python/python-awips-data-access/#dataaccesslayergetgriddata","text":"Now that we have our request and DataTime fcstRun arrays ready, it's time to request the data array from EDEX. 
cycles = DataAccessLayer.getAvailableTimes(request, True) times = DataAccessLayer.getAvailableTimes(request) fcstRun = DataAccessLayer.getForecastRun(cycles[-1], times) response = DataAccessLayer.getGridData(request, [fcstRun[-1]]) for grid in response: data = grid.getRawData() lons, lats = grid.getLatLonCoords() print('Time :', str(grid.getDataTime())) print('Model:', str(grid.getLocationName())) print('Parm :', str(grid.getParameter())) print('Unit :', str(grid.getUnit())) print(data.shape) print(data.min(), data.max()) ('Time :', '2017-08-14 14:00:00 (21)') ('Model:', 'RAP13') ('Parm :', 'T') ('Unit :', 'K') (337, 451) (271.21939, 306.71939)","title":"DataAccessLayer.getGridData()"},{"location":"python/satellite-imagery/","text":"Satellite images are returned by Python AWIPS as grids, and can be rendered with Cartopy pcolormesh the same as gridded forecast models in other python-awips examples. %matplotlib inline from awips.dataaccess import DataAccessLayer import cartopy.crs as ccrs import cartopy.feature as cfeat import matplotlib.pyplot as plt from cartopy.mpl.gridliner import LONGITUDE_FORMATTER, LATITUDE_FORMATTER import numpy as np import datetime DataAccessLayer.changeEDEXHost(\"edex-cloud.unidata.ucar.edu\") request = DataAccessLayer.newDataRequest() request.setDatatype(\"satellite\") Available Satellite Sectors and Products \uf0c1 availableSectors = DataAccessLayer.getAvailableLocationNames(request) availableSectors.sort() print(\"\\nAvailable sectors and products\\n\") for sect in availableSectors: request.setLocationNames(sect) availableProducts = DataAccessLayer.getAvailableParameters(request) availableProducts.sort() print(sect + \":\") for prod in availableProducts: print(\" - \"+prod) Available sectors and products Alaska National: Imager 11 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible Percent of Normal TPW Rain fall rate Sounder Based Derived Precipitable Water (PW) Alaska Regional: Imager 11 micron IR Imager 3.9 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible East CONUS: Imager 11 micron IR Imager 13 micron IR Imager 3.9 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible Low cloud base imagery GOES-East: Imager 11 micron IR Imager 13 micron IR Imager 3.5-4.0 micron IR (Fog) Imager 6.7-6.5 micron IR (WV) Imager Visible GOES-East-West: Imager 11 micron IR Imager 13 micron IR Imager 3.5-4.0 micron IR (Fog) Imager 6.7-6.5 micron IR (WV) Imager Visible GOES-Sounder: CAPE Sounder Based Derived Lifted Index (LI) Sounder Based Derived Precipitable Water (PW) Sounder Based Derived Surface Skin Temp (SFC Skin) Sounder Based Total Column Ozone GOES-West: Imager 11 micron IR Imager 13 micron IR Imager 3.5-4.0 micron IR (Fog) Imager 6.7-6.5 micron IR (WV) Imager Visible Global: Imager 11 micron IR Imager 6.7-6.5 micron IR (WV) Hawaii National: Gridded Cloud Amount Gridded Cloud Top Pressure or Height Imager 11 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible Percent of Normal TPW Rain fall rate Sounder 11.03 micron imagery Sounder 14.06 micron imagery Sounder 3.98 micron imagery Sounder 4.45 micron imagery Sounder 6.51 micron imagery Sounder 7.02 micron imagery Sounder 7.43 micron imagery Sounder Based Derived Lifted Index (LI) Sounder Based Derived Precipitable Water (PW) Sounder Based Derived Surface Skin Temp (SFC Skin) Sounder Visible imagery Hawaii Regional: Imager 11 micron IR Imager 13 micron IR Imager 3.9 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible Mollweide: Imager 11 micron IR Imager 6.7-6.5 micron IR (WV) NEXRCOMP: DHR DVL EET HHC N0R 
N1P NTP NH Composite - Meteosat-GOES E-GOES W-GMS: Imager 11 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible Northern Hemisphere Composite: Imager 11 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible Puerto Rico National: Imager 11 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible Percent of Normal TPW Rain fall rate Sounder Based Derived Precipitable Water (PW) Puerto Rico Regional: Imager 11 micron IR Imager 13 micron IR Imager 3.9 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible Supernational: Gridded Cloud Amount Gridded Cloud Top Pressure or Height Imager 11 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible Percent of Normal TPW Rain fall rate Sounder Based Derived Lifted Index (LI) Sounder Based Derived Precipitable Water (PW) Sounder Based Derived Surface Skin Temp (SFC Skin) West CONUS: Imager 11 micron IR Imager 13 micron IR Imager 3.9 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible Low cloud base imagery Sounder 11.03 micron imagery Sounder 14.06 micron imagery Sounder 3.98 micron imagery Sounder 4.45 micron imagery Sounder 6.51 micron imagery Sounder 7.02 micron imagery Sounder 7.43 micron imagery Sounder Visible imagery Plot Global Water Vapor Composite \uf0c1 request.setLocationNames(\"Global\") availableProducts = DataAccessLayer.getAvailableParameters(request) availableProducts.sort() request.setParameters(availableProducts[0]) utc = datetime.datetime.utcnow() times = DataAccessLayer.getAvailableTimes(request) hourdiff = utc - datetime.datetime.strptime(str(times[-1]),'%Y-%m-%d %H:%M:%S') hours,days = hourdiff.seconds/3600,hourdiff.days minute = str((hourdiff.seconds - (3600 * hours)) / 60) offsetStr = '' if hours > 0: offsetStr += str(hours) + \"hr \" offsetStr += str(minute) + \"m ago\" if days > 1: offsetStr = str(days) + \" days ago\" print(\"Found \"+ str(len(times)) +\" available times\") print(\" \"+str(times[0]) + \"\\n to\\n \" + str(times[-1])) print(\"Using \"+str(times[-1]) + \" (\"+offsetStr+\")\") Found 96 available times 2017-01-23 00:00:00 to 2017-02-03 21:00:00 Using 2017-02-03 21:00:00 (2hr 3m ago) response = DataAccessLayer.getGridData(request, [times[-1]]) grid = response[0] data = grid.getRawData() lons,lats = grid.getLatLonCoords() bbox = [lons.min(), lons.max(), lats.min(), lats.max()] print(\"grid size \" + str(data.shape)) print(\"grid extent \" + str(list(bbox))) grid size (1024, 2048) grid extent [-179.91191, 179.99982, -89.977936, 89.890022] def make_map(bbox, projection=ccrs.PlateCarree()): fig, ax = plt.subplots(figsize=(18,14), subplot_kw=dict(projection=projection)) ax.set_extent(bbox) ax.coastlines(resolution='50m') gl = ax.gridlines(draw_labels=True) gl.xlabels_top = gl.ylabels_right = False gl.xformatter = LONGITUDE_FORMATTER gl.yformatter = LATITUDE_FORMATTER return fig, ax fig, ax = make_map(bbox=bbox) # State boundaries states = cfeat.NaturalEarthFeature(category='cultural', name='admin_1_states_provinces_lines', scale='50m', facecolor='none') ax.add_feature(states, linestyle=':') cs = ax.pcolormesh(lons, lats, data, cmap='Greys_r') cbar = fig.colorbar(cs, shrink=0.9, orientation='horizontal') cbar.set_label(str(grid.getLocationName())+\" \" \\ +str(grid.getParameter())+\" \" \\ +str(grid.getDataTime().getRefTime())) plt.tight_layout()","title":"Satellite imagery"},{"location":"python/satellite-imagery/#available-satellite-sectors-and-products","text":"availableSectors = DataAccessLayer.getAvailableLocationNames(request) availableSectors.sort() print(\"\\nAvailable sectors and products\\n\") for sect in 
availableSectors: request.setLocationNames(sect) availableProducts = DataAccessLayer.getAvailableParameters(request) availableProducts.sort() print(sect + \":\") for prod in availableProducts: print(\" - \"+prod) Available sectors and products Alaska National: Imager 11 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible Percent of Normal TPW Rain fall rate Sounder Based Derived Precipitable Water (PW) Alaska Regional: Imager 11 micron IR Imager 3.9 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible East CONUS: Imager 11 micron IR Imager 13 micron IR Imager 3.9 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible Low cloud base imagery GOES-East: Imager 11 micron IR Imager 13 micron IR Imager 3.5-4.0 micron IR (Fog) Imager 6.7-6.5 micron IR (WV) Imager Visible GOES-East-West: Imager 11 micron IR Imager 13 micron IR Imager 3.5-4.0 micron IR (Fog) Imager 6.7-6.5 micron IR (WV) Imager Visible GOES-Sounder: CAPE Sounder Based Derived Lifted Index (LI) Sounder Based Derived Precipitable Water (PW) Sounder Based Derived Surface Skin Temp (SFC Skin) Sounder Based Total Column Ozone GOES-West: Imager 11 micron IR Imager 13 micron IR Imager 3.5-4.0 micron IR (Fog) Imager 6.7-6.5 micron IR (WV) Imager Visible Global: Imager 11 micron IR Imager 6.7-6.5 micron IR (WV) Hawaii National: Gridded Cloud Amount Gridded Cloud Top Pressure or Height Imager 11 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible Percent of Normal TPW Rain fall rate Sounder 11.03 micron imagery Sounder 14.06 micron imagery Sounder 3.98 micron imagery Sounder 4.45 micron imagery Sounder 6.51 micron imagery Sounder 7.02 micron imagery Sounder 7.43 micron imagery Sounder Based Derived Lifted Index (LI) Sounder Based Derived Precipitable Water (PW) Sounder Based Derived Surface Skin Temp (SFC Skin) Sounder Visible imagery Hawaii Regional: Imager 11 micron IR Imager 13 micron IR Imager 3.9 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible Mollweide: Imager 11 micron IR Imager 6.7-6.5 micron IR (WV) NEXRCOMP: DHR DVL EET HHC N0R N1P NTP NH Composite - Meteosat-GOES E-GOES W-GMS: Imager 11 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible Northern Hemisphere Composite: Imager 11 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible Puerto Rico National: Imager 11 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible Percent of Normal TPW Rain fall rate Sounder Based Derived Precipitable Water (PW) Puerto Rico Regional: Imager 11 micron IR Imager 13 micron IR Imager 3.9 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible Supernational: Gridded Cloud Amount Gridded Cloud Top Pressure or Height Imager 11 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible Percent of Normal TPW Rain fall rate Sounder Based Derived Lifted Index (LI) Sounder Based Derived Precipitable Water (PW) Sounder Based Derived Surface Skin Temp (SFC Skin) West CONUS: Imager 11 micron IR Imager 13 micron IR Imager 3.9 micron IR Imager 6.7-6.5 micron IR (WV) Imager Visible Low cloud base imagery Sounder 11.03 micron imagery Sounder 14.06 micron imagery Sounder 3.98 micron imagery Sounder 4.45 micron imagery Sounder 6.51 micron imagery Sounder 7.02 micron imagery Sounder 7.43 micron imagery Sounder Visible imagery","title":"Available Satellite Sectors and Products"},{"location":"python/satellite-imagery/#plot-global-water-vapor-composite","text":"request.setLocationNames(\"Global\") availableProducts = DataAccessLayer.getAvailableParameters(request) availableProducts.sort() request.setParameters(availableProducts[0]) utc = 
datetime.datetime.utcnow() times = DataAccessLayer.getAvailableTimes(request) hourdiff = utc - datetime.datetime.strptime(str(times[-1]),'%Y-%m-%d %H:%M:%S') hours,days = hourdiff.seconds/3600,hourdiff.days minute = str((hourdiff.seconds - (3600 * hours)) / 60) offsetStr = '' if hours > 0: offsetStr += str(hours) + \"hr \" offsetStr += str(minute) + \"m ago\" if days > 1: offsetStr = str(days) + \" days ago\" print(\"Found \"+ str(len(times)) +\" available times\") print(\" \"+str(times[0]) + \"\\n to\\n \" + str(times[-1])) print(\"Using \"+str(times[-1]) + \" (\"+offsetStr+\")\") Found 96 available times 2017-01-23 00:00:00 to 2017-02-03 21:00:00 Using 2017-02-03 21:00:00 (2hr 3m ago) response = DataAccessLayer.getGridData(request, [times[-1]]) grid = response[0] data = grid.getRawData() lons,lats = grid.getLatLonCoords() bbox = [lons.min(), lons.max(), lats.min(), lats.max()] print(\"grid size \" + str(data.shape)) print(\"grid extent \" + str(list(bbox))) grid size (1024, 2048) grid extent [-179.91191, 179.99982, -89.977936, 89.890022] def make_map(bbox, projection=ccrs.PlateCarree()): fig, ax = plt.subplots(figsize=(18,14), subplot_kw=dict(projection=projection)) ax.set_extent(bbox) ax.coastlines(resolution='50m') gl = ax.gridlines(draw_labels=True) gl.xlabels_top = gl.ylabels_right = False gl.xformatter = LONGITUDE_FORMATTER gl.yformatter = LATITUDE_FORMATTER return fig, ax fig, ax = make_map(bbox=bbox) # State boundaries states = cfeat.NaturalEarthFeature(category='cultural', name='admin_1_states_provinces_lines', scale='50m', facecolor='none') ax.add_feature(states, linestyle=':') cs = ax.pcolormesh(lons, lats, data, cmap='Greys_r') cbar = fig.colorbar(cs, shrink=0.9, orientation='horizontal') cbar.set_label(str(grid.getLocationName())+\" \" \\ +str(grid.getParameter())+\" \" \\ +str(grid.getDataTime().getRefTime())) plt.tight_layout()","title":"Plot Global Water Vapor Composite"},{"location":"python/surface-obs-plot-metpy/","text":"Based on the MetPy example \"Station Plot with Layout\" import datetime import pandas import matplotlib.pyplot as plt import numpy as np import pprint from awips.dataaccess import DataAccessLayer from metpy.calc import get_wind_components from metpy.cbook import get_test_data from metpy.plots.wx_symbols import sky_cover, current_weather from metpy.plots import StationPlot, StationPlotLayout, simple_layout from metpy.units import units def get_cloud_cover(code): if 'OVC' in code: return 1.0 elif 'BKN' in code: return 6.0/8.0 elif 'SCT' in code: return 4.0/8.0 elif 'FEW' in code: return 2.0/8.0 else: return 0 state_capital_wx_stations = {'Washington':'KOLM', 'Oregon':'KSLE', 'California':'KSAC', 'Nevada':'KCXP', 'Idaho':'KBOI', 'Montana':'KHLN', 'Utah':'KSLC', 'Arizona':'KDVT', 'New Mexico':'KSAF', 'Colorado':'KBKF', 'Wyoming':'KCYS', 'North Dakota':'KBIS', 'South Dakota':'KPIR', 'Nebraska':'KLNK', 'Kansas':'KTOP', 'Oklahoma':'KPWA', 'Texas':'KATT', 'Louisiana':'KBTR', 'Arkansas':'KLIT', 'Missouri':'KJEF', 'Iowa':'KDSM', 'Minnesota':'KSTP', 'Wisconsin':'KMSN', 'Illinois':'KSPI', 'Mississippi':'KHKS', 'Alabama':'KMGM', 'Nashville':'KBNA', 'Kentucky':'KFFT', 'Indiana':'KIND', 'Michigan':'KLAN', 'Ohio':'KCMH', 'Georgia':'KFTY', 'Florida':'KTLH', 'South Carolina':'KCUB', 'North Carolina':'KRDU', 'Virginia':'KRIC', 'West Virginia':'KCRW', 'Pennsylvania':'KCXY', 'New York':'KALB', 'Vermont':'KMPV', 'New Hampshire':'KCON', 'Maine':'KAUG', 'Massachusetts':'KBOS', 'Rhode Island':'KPVD', 'Connecticut':'KHFD', 'New Jersey':'KTTN', 'Delaware':'KDOV', 
'Maryland':'KNAK'} single_value_params = [\"timeObs\", \"stationName\", \"longitude\", \"latitude\", \"temperature\", \"dewpoint\", \"windDir\", \"windSpeed\", \"seaLevelPress\"] multi_value_params = [\"presWeather\", \"skyCover\", \"skyLayerBase\"] all_params = single_value_params + multi_value_params obs_dict = dict({all_params: [] for all_params in all_params}) pres_weather = [] sky_cov = [] sky_layer_base = [] from dynamicserialize.dstypes.com.raytheon.uf.common.time import TimeRange from datetime import datetime, timedelta lastHourDateTime = datetime.utcnow() - timedelta(hours = 1) start = lastHourDateTime.strftime('%Y-%m-%d %H') beginRange = datetime.strptime( start + \":00:00\", \"%Y-%m-%d %H:%M:%S\") endRange = datetime.strptime( start + \":59:59\", \"%Y-%m-%d %H:%M:%S\") timerange = TimeRange(beginRange, endRange) DataAccessLayer.changeEDEXHost(\"edex-cloud.unidata.ucar.edu\") request = DataAccessLayer.newDataRequest() request.setDatatype(\"obs\") request.setParameters(*(all_params)) request.setLocationNames(*(state_capital_wx_stations.values())) response = DataAccessLayer.getGeometryData(request,timerange) for ob in response: avail_params = ob.getParameters() if \"presWeather\" in avail_params: pres_weather.append(ob.getString(\"presWeather\")) elif \"skyCover\" in avail_params and \"skyLayerBase\" in avail_params: sky_cov.append(ob.getString(\"skyCover\")) sky_layer_base.append(ob.getNumber(\"skyLayerBase\")) else: for param in single_value_params: if param in avail_params: if param == 'timeObs': obs_dict[param].append(datetime.fromtimestamp(ob.getNumber(param)/1000.0)) else: try: obs_dict[param].append(ob.getNumber(param)) except TypeError: obs_dict[param].append(ob.getString(param)) else: obs_dict[param].append(None) obs_dict['presWeather'].append(pres_weather); obs_dict['skyCover'].append(sky_cov); obs_dict['skyLayerBase'].append(sky_layer_base); pres_weather = [] sky_cov = [] sky_layer_base = [] We can now use pandas to retrieve desired subsets of our observations. In this case, return the most recent observation for each station. 
df = pandas.DataFrame(data=obs_dict, columns=all_params) #sort rows with the newest first df = df.sort_values(by='timeObs', ascending=False) #group rows by station groups = df.groupby('stationName') #create a new DataFrame for the most recent values df_recent = pandas.DataFrame(columns=all_params) #retrieve the first entry for each group, which will #be the most recent observation for rid, station in groups: row = station.head(1) df_recent = pandas.concat([df_recent, row]) Convert DataFrame to something metpy-readable by attaching units and calculating derived values data = dict() data['stid'] = np.array(df_recent[\"stationName\"]) data['latitude'] = np.array(df_recent['latitude']) data['longitude'] = np.array(df_recent['longitude']) data['air_temperature'] = np.array(df_recent['temperature'], dtype=float)* units.degC data['dew_point'] = np.array(df_recent['dewpoint'], dtype=float)* units.degC data['slp'] = np.array(df_recent['seaLevelPress'])* units('mbar') u, v = get_wind_components(np.array(df_recent['windSpeed']) * units('knots'), np.array(df_recent['windDir']) * units.degree) data['eastward_wind'], data['northward_wind'] = u, v data['cloud_frac'] = [int(get_cloud_cover(x)*8) for x in df_recent['skyCover']] %matplotlib inline import cartopy.crs as ccrs import cartopy.feature as feat from matplotlib import rcParams rcParams['savefig.dpi'] = 100 proj = ccrs.LambertConformal(central_longitude=-95, central_latitude=35, standard_parallels=[35]) state_boundaries = feat.NaturalEarthFeature(category='cultural', name='admin_1_states_provinces_lines', scale='110m', facecolor='none') # Create the figure fig = plt.figure(figsize=(20, 15)) ax = fig.add_subplot(1, 1, 1, projection=proj) # Add map elements ax.add_feature(feat.LAND, zorder=-1) ax.add_feature(feat.OCEAN, zorder=-1) ax.add_feature(feat.LAKES, zorder=-1) ax.coastlines(resolution='110m', zorder=2, color='black') ax.add_feature(state_boundaries) ax.add_feature(feat.BORDERS, linewidth='2', edgecolor='black') ax.set_extent((-120, -70, 20, 50)) # Start the station plot by specifying the axes to draw on, as well as the # lon/lat of the stations (with transform). We also set the fontsize to 12 pt. stationplot = StationPlot(ax, data['longitude'], data['latitude'], transform=ccrs.PlateCarree(), fontsize=12) # The layout knows where everything should go, and things are standardized using # the names of variables. So the layout pulls arrays out of `data` and plots them # using `stationplot`. simple_layout.plot(stationplot, data) # Plot the temperature and dew point to the upper and lower left, respectively, of # the center point. Each one uses a different color. stationplot.plot_parameter('NW', np.array(data['air_temperature']), color='red') stationplot.plot_parameter('SW', np.array(data['dew_point']), color='darkgreen') # A more complex example uses a custom formatter to control how the sea-level pressure # values are plotted. This uses the standard trailing 3-digits of the pressure value # in tenths of millibars. stationplot.plot_parameter('NE', np.array(data['slp']), formatter=lambda v: format(10 * v, '.0f')[-3:]) # Plot the cloud cover symbols in the center location. This uses the codes made above and # uses the `sky_cover` mapper to convert these values to font codes for the # weather symbol font. stationplot.plot_symbol('C', data['cloud_frac'], sky_cover) # Also plot the actual text of the station id. Instead of cardinal directions, # plot further out by specifying a location of 2 increments in x and 0 in y. 
stationplot.plot_text((2, 0), np.array(obs_dict[\"stationName\"])) plt.title(\"Most Recent Observations for State Capitals\")","title":"Surface obs plot metpy"},{"location":"python/upper-air-bufr-soundings/","text":"The following script takes you through the steps of retrieving an Upper Air vertical profile from an AWIPS EDEX server and plotting a Skew-T/Log-P chart with Matplotlib and MetPy. The bufrua plugin returns separate objects for parameters at mandatory levels and at significant temperature levels . For the Skew-T/Log-P plot, significant temperature levels are used to plot the pressure, temperature, and dewpoint lines, while mandatory levels are used to plot the wind profile. %matplotlib inline from awips.dataaccess import DataAccessLayer import matplotlib.tri as mtri import matplotlib.pyplot as plt from mpl_toolkits.axes_grid1.inset_locator import inset_axes import numpy as np import math from metpy.calc import get_wind_speed, get_wind_components, lcl, dry_lapse, parcel_profile from metpy.plots import SkewT, Hodograph from metpy.units import units, concatenate # Set host DataAccessLayer.changeEDEXHost(\"edex-cloud.unidata.ucar.edu\") request = DataAccessLayer.newDataRequest() # Set data type request.setDatatype(\"bufrua\") availableLocs = DataAccessLayer.getAvailableLocationNames(request) availableLocs.sort() # Set Mandatory and Significant Temperature level parameters MAN_PARAMS = set(['prMan', 'htMan', 'tpMan', 'tdMan', 'wdMan', 'wsMan']) SIGT_PARAMS = set(['prSigT', 'tpSigT', 'tdSigT']) request.setParameters(\"wmoStaNum\", \"validTime\", \"rptType\", \"staElev\", \"numMand\", \"numSigT\", \"numSigW\", \"numTrop\", \"numMwnd\", \"staName\") request.getParameters().extend(MAN_PARAMS) request.getParameters().extend(SIGT_PARAMS) # Set station ID (not name) request.setLocationNames(\"72562\") #KLBF # Get all times datatimes = DataAccessLayer.getAvailableTimes(request) # Get most recent record response = DataAccessLayer.getGeometryData(request,times=datatimes[-1].validPeriod) # Initialize data arrays tdMan,tpMan,prMan,wdMan,wsMan = np.array([]),np.array([]),np.array([]),np.array([]),np.array([]) prSig,tpSig,tdSig = np.array([]),np.array([]),np.array([]) manGeos = [] sigtGeos = [] # Build arrays for ob in response: if set(ob.getParameters()) & MAN_PARAMS: manGeos.append(ob) prMan = np.append(prMan,ob.getNumber(\"prMan\")) tpMan = np.append(tpMan,ob.getNumber(\"tpMan\")) tdMan = np.append(tdMan,ob.getNumber(\"tdMan\")) wdMan = np.append(wdMan,ob.getNumber(\"wdMan\")) wsMan = np.append(wsMan,ob.getNumber(\"wsMan\")) continue if set(ob.getParameters()) & SIGT_PARAMS: sigtGeos.append(ob) prSig = np.append(prSig,ob.getNumber(\"prSigT\")) tpSig = np.append(tpSig,ob.getNumber(\"tpSigT\")) tdSig = np.append(tdSig,ob.getNumber(\"tdSigT\")) continue # Sort mandatory levels (but not sigT levels) because of the 1000.MB interpolation inclusion ps = prMan.argsort()[::-1] wpres = prMan[ps] direc = wdMan[ps] spd = wsMan[ps] tman = tpMan[ps] dman = tdMan[ps] # Flag missing data prSig[prSig <= -9999] = np.nan tpSig[tpSig <= -9999] = np.nan tdSig[tdSig <= -9999] = np.nan wpres[wpres <= -9999] = np.nan tman[tman <= -9999] = np.nan dman[dman <= -9999] = np.nan direc[direc <= -9999] = np.nan spd[spd <= -9999] = np.nan # assign units p = (prSig/100) * units.mbar T = (tpSig-273.15) * units.degC Td = (tdSig-273.15) * units.degC wpres = (wpres/100) * units.mbar tman = tman * units.degC dman = dman * units.degC u,v = get_wind_components(spd, np.deg2rad(direc)) # Create SkewT/LogP plt.rcParams['figure.figsize'] 
= (8, 10) skew = SkewT() skew.plot(p, T, 'r', linewidth=2) skew.plot(p, Td, 'g', linewidth=2) skew.plot_barbs(wpres, u, v) skew.ax.set_ylim(1000, 100) skew.ax.set_xlim(-30, 30) title_string = \" T(F) Td \" title_string += \" \" + str(ob.getString(\"staName\")) title_string += \" \" + str(ob.getDataTime().getRefTime()) title_string += \" (\" + str(ob.getNumber(\"staElev\")) + \"m elev)\" title_string += \"\\n\" + str(round(T[0].to('degF').item(),1)) title_string += \" \" + str(round(Td[0].to('degF').item(),1)) plt.title(title_string, loc='left') # Calculate LCL height and plot as black dot l = lcl(p[0], T[0], Td[0]) lcl_temp = dry_lapse(concatenate((p[0], l)), T[0])[-1].to('degC') skew.plot(l, lcl_temp, 'ko', markerfacecolor='black') # Calculate full parcel profile and add to plot as black line prof = parcel_profile(p, T[0], Td[0]).to('degC') skew.plot(p, prof, 'k', linewidth=2) # An example of a slanted line at constant T -- in this case the 0 isotherm l = skew.ax.axvline(0, color='c', linestyle='--', linewidth=2) # Draw hodograph ax_hod = inset_axes(skew.ax, '30%', '30%', loc=3) h = Hodograph(ax_hod, component_range=max(wsMan)) h.add_grid(increment=20) h.plot_colormapped(u, v, spd) # Show the plot plt.show()","title":"Upper air bufr soundings"},{"location":"raytheon/cave_d2d/","text":"Raytheon: CAVE D2D User's Manual (13.4.1) \uf0c1 Note: This manual is from Raytheon, specifically for the NWS AWIPS, some of the content may not apply to Unidata's AWIPS. Also, this manual is for an older version of AWIPS, but it is the most recent version of the manual we have access to. This browser does not support PDFs. Please download the PDF to view it: Download PDF","title":"Raytheon: CAVE User's Manual"},{"location":"raytheon/cave_d2d/#raytheon-cave-d2d-users-manual-1341","text":"Note: This manual is from Raytheon, specifically for the NWS AWIPS, some of the content may not apply to Unidata's AWIPS. Also, this manual is for an older version of AWIPS, but it is the most recent version of the manual we have access to. This browser does not support PDFs. Please download the PDF to view it: Download PDF","title":"Raytheon: CAVE D2D User's Manual (13.4.1)"},{"location":"raytheon/smm/","text":"Raytheon: System Manager's Manual (13.4.1) \uf0c1 Note: This manual is from Raytheon, specifically for the NWS AWIPS, some of the content may not apply to Unidata's AWIPS. Also, this manual is for an older version of AWIPS, but it is the most recent version of the manual we have access to. This browser does not support PDFs. Please download the PDF to view it: Download PDF","title":"Raytheon: AWIPS System Manager's Manual"},{"location":"raytheon/smm/#raytheon-system-managers-manual-1341","text":"Note: This manual is from Raytheon, specifically for the NWS AWIPS, some of the content may not apply to Unidata's AWIPS. Also, this manual is for an older version of AWIPS, but it is the most recent version of the manual we have access to. This browser does not support PDFs. Please download the PDF to view it: Download PDF","title":"Raytheon: System Manager's Manual (13.4.1)"}]}
\ No newline at end of file
+{"config":{"indexing":"full","lang":["en"],"min_search_length":3,"prebuild_index":false,"separator":"[\\s\\-]+"},"docs":[{"location":"","text":"Unidata AWIPS User Manual \uf0c1 https://www.unidata.ucar.edu/software/awips2 The Advanced Weather Interactive Processing System (AWIPS) is a meteorological software package. It is used for decoding, displaying, and analyzing data, and was originally developed for the National Weather Service (NWS) by Raytheon. There is a division here at UCAR called the Unidata Program Center (UPC) which develops and supports a modified non-operational version of AWIPS for use in research and education by UCAR member institutions . This is released as open source software, free to download and use by anyone. AWIPS takes a unified approach to data ingest, where most data ingested into the system comes through the LDM client pulling data feeds from the Unidata IDD . Various raw data and product files (netCDF, grib, BUFR, ASCII text, gini, AREA) are decoded and stored as HDF5 files and Postgres metadata by EDEX , which serves products and data over http. Unidata supports two data visualization frameworks: CAVE (an Eclipse-built Java application which runs on Linux, Mac, and Windows), and python-awips (a python package). Note : Our version of CAVE is a non-operational version. It does not support some features of NWS AWIPS. Warnings and alerts cannot be issued from Unidata's CAVE. Additional functionality may not be available as well. Download and Install CAVE \uf0c1 Download and Install EDEX \uf0c1 Work with Python-AWIPS \uf0c1 License \uf0c1 Unidata AWIPS source code and binaries (RPMs) are considered to be in the public domain, meaning there are no restrictions on any download, modification, or distribution in any form (original or modified). Unidata AWIPS license information can be found here . AWIPS Data in the Cloud \uf0c1 Unidata and XSEDE Jetstream have partnered to offer an EDEX data server in the cloud, open to the community. Select the server in the Connectivity Preferences dialog, or enter edex-cloud.unidata.ucar.edu (without http:// before, or :9581/services after). Distributed Computing \uf0c1 AWIPS makes use of service-oriented architecture to request, process, and serve real-time meteorological data. Because AWIPS was originally developed for use on internal NWS forecast office networks, where operational installations of AWIPS can consist of a dozen servers or more, Unidata modified the package to be more applicable in the University setting. Because the AWIPS source code was hard-coded with the NWS network configuration, the early Unidata releases were stripped of operation-specific configurations and plugins, and released specifically for standalone installation. This made sense given that a single EDEX instance with a Solid State Drive (SSD) could handle most of the entire NOAAport data volume. However, with GOES-R(16) now online, and more gridded forecast models being created at finer temporal and spatial resolutions, there was a need to distribute EDEX data decoding in order to handle this firehose of data. Read More: Distributed EDEX Software Components \uf0c1 EDEX CAVE LDM edexBridge Qpid PostgreSQL HDF5 PyPIES EDEX \uf0c1 The main server for AWIPS. Qpid sends alerts to EDEX when data stored by the LDM is ready for processing. These Qpid messages include file header information which allows EDEX to determine the appropriate data decoder to use. 
The default ingest server (simply named ingest) handles all data ingest other than grib messages, which are processed by a separate ingestGrib server. After decoding, EDEX writes metadata to the database via Postgres and saves the processed data in HDF5 via PyPIES. A third EDEX server, request, feeds requested data to CAVE clients. EDEX ingest and request servers are started and stopped with the commands edex start and edex stop , which runs the system script /etc/rc.d/init.d/edex_camel Read More: How to Install EDEX CAVE \uf0c1 Common AWIPS Visualization Environment. The data rendering and visualization tool for AWIPS. CAVE consists of a number of different data display configurations called perspectives. Perspectives used in operational forecasting environments include D2D (Display Two-Dimensional), GFE (Graphical Forecast Editor), and NCP (National Centers Perspective). CAVE is started with the command /awips2/cave/cave.sh or cave.sh Read More: How to Install CAVE LDM \uf0c1 https://www.unidata.ucar.edu/software/ldm/ The LDM (Local Data Manager), developed and supported by Unidata, is a suite of client and server programs designed for data distribution, and is the fundamental component comprising the Unidata Internet Data Distribution (IDD) system. In AWIPS, the LDM provides data feeds for grids, surface observations, upper-air profiles, satellite and radar imagery and various other meteorological datasets. The LDM writes data directly to file and alerts EDEX via Qpid when a file is available for processing. The LDM is started and stopped with the commands edex start and edex stop , which runs the commands service edex_ldm start and service edex_ldm stop edexBridge \uf0c1 edexBridge, invoked in the LDM configuration file /awips2/ldm/etc/ldmd.conf , is used by the LDM to post \"data available\" messages to Qpid, which alerts the EDEX Ingest server that a file is ready for processing. Qpid \uf0c1 http://qpid.apache.org Apache Qpid , the Queue Processor Interface Daemon, is the messaging system used by AWIPS to facilitate communication between services. When the LDM receives a data file to be processed, it employs edexBridge to send EDEX ingest servers a message via Qpid. When EDEX has finished decoding the file, it sends CAVE a message via Qpid that data are available for display or further processing. Qpid is started and stopped by edex start and edex stop , and is controlled by the system script /etc/rc.d/init.d/qpidd PostgreSQL \uf0c1 http://www.postgresql.org PostgreSQL , known simply as Postgres, is a relational database management system (DBMS) which handles the storage and retrieval of metadata, database tables and some decoded data. The storage and reading of EDEX metadata is handled by the Postgres DBMS. Users may query the metadata tables by using the terminal-based front-end for Postgres called psql . Postgres is started and stopped by edex start and edex stop , and is controlled by the system script /etc/rc.d/init.d/edex_postgres HDF5 \uf0c1 http://www.hdfgroup.org/HDF5/ Hierarchical Data Format (v.5) is the primary data storage format used by AWIPS for processed grids, satellite and radar imagery and other products. Similar to netCDF, developed and supported by Unidata, HDF5 supports multiple types of data within a single file. For example, a single HDF5 file of radar data may contain multiple volume scans of base reflectivity and base velocity as well as derived products such as composite reflectivity. The file may also contain data from multiple radars. 
HDF5 data is stored on the EDEX server in /awips2/edex/data/hdf5/ . PyPIES \uf0c1 PyPIES , Python Process Isolated Enhanced Storage, (httpd-pypies) was created for AWIPS to isolate the management of HDF5 Processed Data Storage from the EDEX processes. PyPIES manages access, i.e., reads and writes, of data in the HDF5 files. In a sense, PyPIES provides functionality similar to a DBMS (i.e PostgreSQL for metadata); all data being written to an HDF5 file is sent to PyPIES, and requests for data stored in HDF5 are processed by PyPIES. PyPIES is implemented in two parts: 1. The PyPIES manager is a Python application that runs as part of an Apache HTTP server, and handles requests to store and retrieve data. 2. The PyPIES logger is a Python process that coordinates logging. PyPIES is started and stopped by edex start and edex stop , and is controlled by the system script /etc/rc.d/init.d/httpd-pypies .","title":"Home"},{"location":"#unidata-awips-user-manual","text":"https://www.unidata.ucar.edu/software/awips2 The Advanced Weather Interactive Processing System (AWIPS) is a meteorological software package. It is used for decoding, displaying, and analyzing data, and was originally developed for the National Weather Service (NWS) by Raytheon. There is a division here at UCAR called the Unidata Program Center (UPC) which develops and supports a modified non-operational version of AWIPS for use in research and education by UCAR member institutions . This is released as open source software, free to download and use by anyone. AWIPS takes a unified approach to data ingest, where most data ingested into the system comes through the LDM client pulling data feeds from the Unidata IDD . Various raw data and product files (netCDF, grib, BUFR, ASCII text, gini, AREA) are decoded and stored as HDF5 files and Postgres metadata by EDEX , which serves products and data over http. Unidata supports two data visualization frameworks: CAVE (an Eclipse-built Java application which runs on Linux, Mac, and Windows), and python-awips (a python package). Note : Our version of CAVE is a non-operational version. It does not support some features of NWS AWIPS. Warnings and alerts cannot be issued from Unidata's CAVE. Additional functionality may not be available as well.","title":"Unidata AWIPS User Manual"},{"location":"#download-and-install-cave","text":"","title":"Download and Install CAVE"},{"location":"#download-and-install-edex","text":"","title":"Download and Install EDEX"},{"location":"#work-with-python-awips","text":"","title":"Work with Python-AWIPS"},{"location":"#license","text":"Unidata AWIPS source code and binaries (RPMs) are considered to be in the public domain, meaning there are no restrictions on any download, modification, or distribution in any form (original or modified). Unidata AWIPS license information can be found here .","title":"License"},{"location":"#awips-data-in-the-cloud","text":"Unidata and XSEDE Jetstream have partnered to offer an EDEX data server in the cloud, open to the community. Select the server in the Connectivity Preferences dialog, or enter edex-cloud.unidata.ucar.edu (without http:// before, or :9581/services after).","title":"AWIPS Data in the Cloud"},{"location":"#distributed-computing","text":"AWIPS makes use of service-oriented architecture to request, process, and serve real-time meteorological data. 
Because AWIPS was originally developed for use on internal NWS forecast office networks, where operational installations of AWIPS can consist of a dozen servers or more, Unidata modified the package to be more applicable in the University setting. Because the AWIPS source code was hard-coded with the NWS network configuration, the early Unidata releases were stripped of operation-specific configurations and plugins, and released specifically for standalone installation. This made sense given that a single EDEX instance with a Solid State Drive (SSD) could handle most of the entire NOAAport data volume. However, with GOES-R(16) now online, and more gridded forecast models being created at finer temporal and spatial resolutions, there was a need to distribute EDEX data decoding in order to handle this firehose of data. Read More: Distributed EDEX","title":"Distributed Computing"},{"location":"#software-components","text":"EDEX CAVE LDM edexBridge Qpid PostgreSQL HDF5 PyPIES","title":"Software Components"},{"location":"#edex","text":"The main server for AWIPS. Qpid sends alerts to EDEX when data stored by the LDM is ready for processing. These Qpid messages include file header information which allows EDEX to determine the appropriate data decoder to use. The default ingest server (simply named ingest) handles all data ingest other than grib messages, which are processed by a separate ingestGrib server. After decoding, EDEX writes metadata to the database via Postgres and saves the processed data in HDF5 via PyPIES. A third EDEX server, request, feeds requested data to CAVE clients. EDEX ingest and request servers are started and stopped with the commands edex start and edex stop , which runs the system script /etc/rc.d/init.d/edex_camel Read More: How to Install EDEX","title":"EDEX"},{"location":"#cave","text":"Common AWIPS Visualization Environment. The data rendering and visualization tool for AWIPS. CAVE contains of a number of different data display configurations called perspectives. Perspectives used in operational forecasting environments include D2D (Display Two-Dimensional), GFE (Graphical Forecast Editor), and NCP (National Centers Perspective). CAVE is started with the command /awips2/cave/cave.sh or cave.sh Read More: How to Install CAVE","title":"CAVE"},{"location":"#ldm","text":"https://www.unidata.ucar.edu/software/ldm/ The LDM (Local Data Manager), developed and supported by Unidata, is a suite of client and server programs designed for data distribution, and is the fundamental component comprising the Unidata Internet Data Distribution (IDD) system. In AWIPS, the LDM provides data feeds for grids, surface observations, upper-air profiles, satellite and radar imagery and various other meteorological datasets. The LDM writes data directly to file and alerts EDEX via Qpid when a file is available for processing. The LDM is started and stopped with the commands edex start and edex stop , which runs the commands service edex_ldm start and service edex_ldm stop","title":"LDM"},{"location":"#edexbridge","text":"edexBridge, invoked in the LDM configuration file /awips2/ldm/etc/ldmd.conf , is used by the LDM to post \"data available\" messaged to Qpid, which alerts the EDEX Ingest server that a file is ready for processing.","title":"edexBridge"},{"location":"#qpid","text":"http://qpid.apache.org Apache Qpid , the Queue Processor Interface Daemon, is the messaging system used by AWIPS to facilitate communication between services. 
When the LDM receives a data file to be processed, it employs edexBridge to send EDEX ingest servers a message via Qpid. When EDEX has finished decoding the file, it sends CAVE a message via Qpid that data are available for display or further processing. Qpid is started and stopped by edex start and edex stop , and is controlled by the system script /etc/rc.d/init.d/qpidd","title":"Qpid"},{"location":"#postgresql","text":"http://www.postgresql.org PostgreSQL , known simply as Postgres, is a relational database management system (DBMS) which handles the storage and retrieval of metadata, database tables and some decoded data. The storage and reading of EDEX metadata is handled by the Postgres DBMS. Users may query the metadata tables by using the terminal-based front-end for Postgres called psql . Postgres is started and stopped by edex start and edex stop , and is controlled by the system script /etc/rc.d/init.d/edex_postgres","title":"PostgreSQL"},{"location":"#hdf5","text":"http://www.hdfgroup.org/HDF5/ Hierarchical Data Format (v.5) is the primary data storage format used by AWIPS for processed grids, satellite and radar imagery and other products. Similar to netCDF, developed and supported by Unidata, HDF5 supports multiple types of data within a single file. For example, a single HDF5 file of radar data may contain multiple volume scans of base reflectivity and base velocity as well as derived products such as composite reflectivity. The file may also contain data from multiple radars. HDF5 data is stored on the EDEX server in /awips2/edex/data/hdf5/ .","title":"HDF5"},{"location":"#pypies","text":"PyPIES , Python Process Isolated Enhanced Storage, (httpd-pypies) was created for AWIPS to isolate the management of HDF5 Processed Data Storage from the EDEX processes. PyPIES manages access, i.e., reads and writes, of data in the HDF5 files. In a sense, PyPIES provides functionality similar to a DBMS (i.e., PostgreSQL for metadata); all data being written to an HDF5 file is sent to PyPIES, and requests for data stored in HDF5 are processed by PyPIES. PyPIES is implemented in two parts: 1. The PyPIES manager is a Python application that runs as part of an Apache HTTP server, and handles requests to store and retrieve data. 2. The PyPIES logger is a Python process that coordinates logging. 
PyPIES is started and stopped by edex start and edex stop , and is controlled by the system script /etc/rc.d/init.d/httpd-pypies .","title":"PyPIES"},{"location":"appendix/appendix-acronyms/","text":"A \uf0c1 ACARS - Aircraft Communications Addressing and Reporting System AEV - AFOS-Era Verification AFOS - Automation of Field Operations and Services AGL - above ground level AI - AWIPS Identifier AMSU - Advanced Microwave Sounding Unit ARD - AWIPS Remote Display ASL - Above Sea Level ASOS - Automated Surface Observing System ASR - Airport Surveillance Radar ATMS - Advanced Technology Microwave Sounder AvnFPS - Aviation Forecast Preparation System AVP - AWIPS Verification Program AWC - Aviation Weather Center AWIPS - Advanced Weather Interactive Processing System B \uf0c1 BGAN - Broadboand Global Area Network BUFR - Binary Universal Form for the Representation of meteorological data C \uf0c1 CAPE - Convective Available Potential Energy CAVE - Common AWIPS Visualization Environment CC - Correlation Coefficient CCF - Coded Cities Forecast CCFP - Collaborative Convective Forecast Product CCL - Convective Condensation Level CDP - Cell Display Parameters CFC - Clutter Filter Control CGI - Common Gateway Interface CIN - Convective Inhibition CITR - Commerce Information Technology Requirement CONUS - Conterminous/Contiguous/Continental United States COOP - Continuity Of Operations Planning COTS - commercial off-the-shelf CrIMSS - Cross-track Infrared and Microwave Sounder Suite CrIS - Cross-track Infrared Sounder CWA - County Warning Area CWSU - Center Weather Service Unit CZ - Composite Reflectivity D \uf0c1 D2D - Display 2 Dimensions DFM - Digital Forecast Matrix DMD - Digital Mesocyclone Display DMS - Data Monitoring System DOC - Department of Commerce DPA - Digital Precipitation Array E \uf0c1 ECMWF - European Centre for Medium-Range Forecasts EDEX - Environmental Data EXchange EMC - Environmental Modeling Center EL - Equilibrium Level ESA - Electronic Systems Analyst ESRL - Earth System Research Laboratory F \uf0c1 FFG - Flash Flood Guidance FFFG - Forced Flash Flood Guidance FFMP - Flash Flood Monitoring and Prediction FFMPA - Flash Flood Monitoring and Prediction: Advanced FFTI - Flash Flood Threat Index FFW - Flash Flood Warning FSL - Forecast Systems Laboratory G \uf0c1 GFE - Graphical Forecast Editor GFS - Global Forecasting Systems GHG - Graphical Hazards Generator GIS - Geographic Information Systems GMT - Greenwich Mean Time GOES - Geostationary Operational Environmental Satellite GSD - Global System Division H \uf0c1 HC - Hydrometeor Classification HI - Hail Index HM - Hydromet HPC - Hydrologic Precipitation Center HWR - Hourly Weather Roundup I \uf0c1 ICAO - International Civil Aviation Organization IFP - Interactive Forecast Program IFPS - Interactive Forecast Preparation System IHFS - Integrated Hydrologic Forecast System IMET - Incident Meteorologist IR - infrared ISS - Incident Support Specialist IST - Interactive Skew-T J \uf0c1 JMS - Java Messaging System K \uf0c1 KDP - Specific Differential Phase KML - Keyhole Markup Language KMZ - KML zipped (compressed). 
L \uf0c1 LAC - Listening Area Code LAMP - Localized Aviation MOS Program LAN - Local Area Network LAPS - Local Analysis and Prediction System LARC - Local Automatic Remote Collector LCL - Lifting Condensation Level LDAD - Local Data Acquisition and Dissemination LFC - Level of Free Convection LSR - Local Storm Report M \uf0c1 MAPS - Mesoscale Analysis and Prediction System mb - millibar; pressure MDCRS - Meteorological Data Collection and Receiving System MDL - Meteorological Development Laboratory MDP - Mesocyclone Display Parameters MDPI - Microburst-Day Potential Index MEF - Manually Entered Forecast METAR - Meteorological Aviation Report MHS - message handling system ML - Melting Layer MND - Mass News Dissemination MOS - Model Output Statistics MPC - Marine Prediction Center MPE - Multisensor Precipitation Estimator MRD - Message Reference Descriptor MRU - Meso Rapid Update MSAS - MAPS Surface Assimilation System MSL - Mean Sea Level N \uf0c1 NAM - North American Mesoscale model NCEP - National Centers for Environmental Prediction NCF - Network Control Facility NDFD - National Digital Forecast Database NE-PAC - Northeastern Pacific NESDIS - National Environmental Satellite, Data and Information Service NH - Northern Hemisphere nMi - nautical miles NOAA - National Oceanic and Atmospheric Administration NPN - NOAA Profiler Network NPP - Suomi National Polar-orbiting Partnership NUCAPS - NOAA Unique CrIS/ATMS Processing Systems NWP - Numerical Weather Prediction NWR - NOAA Weather Radio NWS - National Weather Service NWRWAVES - NOAA Weather Radio With All-Hazards VTEC Enhanced Software NWSRFS - National Weather Service River Forecast System NWWS - NOAA Weather Wire Service O \uf0c1 OCP - Ocean Prediction Center OH - Office of Hydrology OPC - Ocean Prediction Center ORPG - Open Radar Products Generator OSD - One Hour Snow Depth OSW - One Hour Snow Water OTR - One Time Request P \uf0c1 PID - Product Identification PIL - Product Inventory List PIREP - Pilot Weather Report POES - Polar Operational Environmental Satellite POSH - Probability of Severe Hail POH - Probability of Hail POP - Probability of Precipitation PQPF - Probabilistic QPF PRF - Pulse Repetition Frequency Q \uf0c1 QC - quality control QCMS - Quality Control and Monitoring System QPE - Quantitative Precipitation Estimator QPF - Quantitative Precipitation Forecast QPS - Quantitative Precipitation Summary R \uf0c1 RAOB - Radiosonde Observation RAP - Rapid Refresh (Replaced RUC) RCM - Radar Coded Message RER - Record Report RFC - River Forecast Center RGB - Red, Green, Blue RHI - Range Height Indicator RMR - Radar Multiple Request ROSA - Remote Observing System Automation RPG - Radar Product Generator RPS - routine product set RTD - Requirements Traceability Document; Routine, Delayed RTMA - Real Time Mesoscale Analysts RUC - Rapid Update Cycle (Replaced by RAP) S \uf0c1 SAFESEAS - System on AWIPS for Forecasting and Evaluation of Seas and Lakes SBN - Satellite Broadcast Network SCAN - System for Convection Analysis and Nowcasting SCD - Supplementary Climatological Data SCID - Storm Cell Identification Display SCP - Satellite Cloud Product SCTI - SCAN CWA Threat Index SDC - State Distribution Circuit SNOW - System for Nowcasting Of Winter Weather SOO - Science and Operations Officer SPC - Storm Prediction Center SPE - Satellite Precipitation Estimate SREF - Short Range Ensemble Forecast SRG - Supplemental Product Generator SRM - Storm Relative Motion SSD - Storm-Total Snow Depth SSM/I - Special Sensor Microwave/Imager SSW - 
Storm-Total Snow Water STI - Storm Track Information Suomi NPP - Suomi National Polar-orbiting Partnership SW - Spectrum Width SWEAT Index - Severe Weather Threat Index SWP - Severe Weather Probability T \uf0c1 TAF - Terminal Aerodrome Forecast (international code) TAFB - Tropical Analysis and Forecast Branch TCM - Marine/Tropical Cyclone Advisory TCP - Public Tropical Cyclone Advisory TDWR - Terminal Doppler Weather Radio TE-PAC - Tropical Pacific TMI - Text Message Intercept TRU - TVS Rapid Update TT - Total Totals TVS - Tornado Vortex Signature TWB - Transcribed Weather Broadcasts U \uf0c1 UGC - Universal Geographic Code ULR - User Selectable Layer Reflectivity URL - Universal Resource Locator USD - User Selectable Snow Depth USW - User Selectable Snow Water UTC - Coordinated Universal Time V \uf0c1 VAD - Velocity Azimuth Display VCP - volume coverage pattern VIIR - Visible Infrared Imager Radiometer Suite VIL - Vertically Integrated Liquid VTEC - Valid Time and Event Code VWP - VAD Wind Profile W \uf0c1 W-ATL - Western Atlantic WFO - Weather Forecast Office WINDEX - Wind Index WMO - World Meteorological Organization WSFO - Weather Service Forecast Office WSO - Weather Service Office WSOM - Weather Service Operations Manual WSR-88D - Weather Surveillance Radar-1988 Doppler WWA - Watch Warning Advisory WV - water vapor Z \uf0c1 Z - Reflectivity ZDR - Differential Reflectivity","title":"Acronyms and Abbreviations"},{"location":"appendix/appendix-acronyms/#a","text":"ACARS - Aircraft Communications Addressing and Reporting System AEV - AFOS-Era Verification AFOS - Automation of Field Operations and Services AGL - above ground level AI - AWIPS Identifier AMSU - Advanced Microwave Sounding Unit ARD - AWIPS Remote Display ASL - Above Sea Level ASOS - Automated Surface Observing System ASR - Airport Surveillance Radar ATMS - Advanced Technology Microwave Sounder AvnFPS - Aviation Forecast Preparation System AVP - AWIPS Verification Program AWC - Aviation Weather Center AWIPS - Advanced Weather Interactive Processing System","title":"A"},{"location":"appendix/appendix-acronyms/#b","text":"BGAN - Broadboand Global Area Network BUFR - Binary Universal Form for the Representation of meteorological data","title":"B"},{"location":"appendix/appendix-acronyms/#c","text":"CAPE - Convective Available Potential Energy CAVE - Common AWIPS Visualization Environment CC - Correlation Coefficient CCF - Coded Cities Forecast CCFP - Collaborative Convective Forecast Product CCL - Convective Condensation Level CDP - Cell Display Parameters CFC - Clutter Filter Control CGI - Common Gateway Interface CIN - Convective Inhibition CITR - Commerce Information Technology Requirement CONUS - Conterminous/Contiguous/Continental United States COOP - Continuity Of Operations Planning COTS - commercial off-the-shelf CrIMSS - Cross-track Infrared and Microwave Sounder Suite CrIS - Cross-track Infrared Sounder CWA - County Warning Area CWSU - Center Weather Service Unit CZ - Composite Reflectivity","title":"C"},{"location":"appendix/appendix-acronyms/#d","text":"D2D - Display 2 Dimensions DFM - Digital Forecast Matrix DMD - Digital Mesocyclone Display DMS - Data Monitoring System DOC - Department of Commerce DPA - Digital Precipitation Array","title":"D"},{"location":"appendix/appendix-acronyms/#e","text":"ECMWF - European Centre for Medium-Range Forecasts EDEX - Environmental Data EXchange EMC - Environmental Modeling Center EL - Equilibrium Level ESA - Electronic Systems Analyst ESRL - Earth System Research 
Laboratory","title":"E"},{"location":"appendix/appendix-acronyms/#f","text":"FFG - Flash Flood Guidance FFFG - Forced Flash Flood Guidance FFMP - Flash Flood Monitoring and Prediction FFMPA - Flash Flood Monitoring and Prediction: Advanced FFTI - Flash Flood Threat Index FFW - Flash Flood Warning FSL - Forecast Systems Laboratory","title":"F"},{"location":"appendix/appendix-acronyms/#g","text":"GFE - Graphical Forecast Editor GFS - Global Forecasting Systems GHG - Graphical Hazards Generator GIS - Geographic Information Systems GMT - Greenwich Mean Time GOES - Geostationary Operational Environmental Satellite GSD - Global System Division","title":"G"},{"location":"appendix/appendix-acronyms/#h","text":"HC - Hydrometeor Classification HI - Hail Index HM - Hydromet HPC - Hydrologic Precipitation Center HWR - Hourly Weather Roundup","title":"H"},{"location":"appendix/appendix-acronyms/#i","text":"ICAO - International Civil Aviation Organization IFP - Interactive Forecast Program IFPS - Interactive Forecast Preparation System IHFS - Integrated Hydrologic Forecast System IMET - Incident Meteorologist IR - infrared ISS - Incident Support Specialist IST - Interactive Skew-T","title":"I"},{"location":"appendix/appendix-acronyms/#j","text":"JMS - Java Messaging System","title":"J"},{"location":"appendix/appendix-acronyms/#k","text":"KDP - Specific Differential Phase KML - Keyhole Markup Language KMZ - KML zipped (compressed).","title":"K"},{"location":"appendix/appendix-acronyms/#l","text":"LAC - Listening Area Code LAMP - Localized Aviation MOS Program LAN - Local Area Network LAPS - Local Analysis and Prediction System LARC - Local Automatic Remote Collector LCL - Lifting Condensation Level LDAD - Local Data Acquisition and Dissemination LFC - Level of Free Convection LSR - Local Storm Report","title":"L"},{"location":"appendix/appendix-acronyms/#m","text":"MAPS - Mesoscale Analysis and Prediction System mb - millibar; pressure MDCRS - Meteorological Data Collection and Receiving System MDL - Meteorological Development Laboratory MDP - Mesocyclone Display Parameters MDPI - Microburst-Day Potential Index MEF - Manually Entered Forecast METAR - Meteorological Aviation Report MHS - message handling system ML - Melting Layer MND - Mass News Dissemination MOS - Model Output Statistics MPC - Marine Prediction Center MPE - Multisensor Precipitation Estimator MRD - Message Reference Descriptor MRU - Meso Rapid Update MSAS - MAPS Surface Assimilation System MSL - Mean Sea Level","title":"M"},{"location":"appendix/appendix-acronyms/#n","text":"NAM - North American Mesoscale model NCEP - National Centers for Environmental Prediction NCF - Network Control Facility NDFD - National Digital Forecast Database NE-PAC - Northeastern Pacific NESDIS - National Environmental Satellite, Data and Information Service NH - Northern Hemisphere nMi - nautical miles NOAA - National Oceanic and Atmospheric Administration NPN - NOAA Profiler Network NPP - Suomi National Polar-orbiting Partnership NUCAPS - NOAA Unique CrIS/ATMS Processing Systems NWP - Numerical Weather Prediction NWR - NOAA Weather Radio NWS - National Weather Service NWRWAVES - NOAA Weather Radio With All-Hazards VTEC Enhanced Software NWSRFS - National Weather Service River Forecast System NWWS - NOAA Weather Wire Service","title":"N"},{"location":"appendix/appendix-acronyms/#o","text":"OCP - Ocean Prediction Center OH - Office of Hydrology OPC - Ocean Prediction Center ORPG - Open Radar Products Generator OSD - One Hour Snow Depth OSW - One Hour Snow Water 
OTR - One Time Request","title":"O"},{"location":"appendix/appendix-acronyms/#p","text":"PID - Product Identification PIL - Product Inventory List PIREP - Pilot Weather Report POES - Polar Operational Environmental Satellite POSH - Probability of Severe Hail POH - Probability of Hail POP - Probability of Precipitation PQPF - Probabilistic QPF PRF - Pulse Repetition Frequency","title":"P"},{"location":"appendix/appendix-acronyms/#q","text":"QC - quality control QCMS - Quality Control and Monitoring System QPE - Quantitative Precipitation Estimator QPF - Quantitative Precipitation Forecast QPS - Quantitative Precipitation Summary","title":"Q"},{"location":"appendix/appendix-acronyms/#r","text":"RAOB - Radiosonde Observation RAP - Rapid Refresh (Replaced RUC) RCM - Radar Coded Message RER - Record Report RFC - River Forecast Center RGB - Red, Green, Blue RHI - Range Height Indicator RMR - Radar Multiple Request ROSA - Remote Observing System Automation RPG - Radar Product Generator RPS - routine product set RTD - Requirements Traceability Document; Routine, Delayed RTMA - Real Time Mesoscale Analysts RUC - Rapid Update Cycle (Replaced by RAP)","title":"R"},{"location":"appendix/appendix-acronyms/#s","text":"SAFESEAS - System on AWIPS for Forecasting and Evaluation of Seas and Lakes SBN - Satellite Broadcast Network SCAN - System for Convection Analysis and Nowcasting SCD - Supplementary Climatological Data SCID - Storm Cell Identification Display SCP - Satellite Cloud Product SCTI - SCAN CWA Threat Index SDC - State Distribution Circuit SNOW - System for Nowcasting Of Winter Weather SOO - Science and Operations Officer SPC - Storm Prediction Center SPE - Satellite Precipitation Estimate SREF - Short Range Ensemble Forecast SRG - Supplemental Product Generator SRM - Storm Relative Motion SSD - Storm-Total Snow Depth SSM/I - Special Sensor Microwave/Imager SSW - Storm-Total Snow Water STI - Storm Track Information Suomi NPP - Suomi National Polar-orbiting Partnership SW - Spectrum Width SWEAT Index - Severe Weather Threat Index SWP - Severe Weather Probability","title":"S"},{"location":"appendix/appendix-acronyms/#t","text":"TAF - Terminal Aerodrome Forecast (international code) TAFB - Tropical Analysis and Forecast Branch TCM - Marine/Tropical Cyclone Advisory TCP - Public Tropical Cyclone Advisory TDWR - Terminal Doppler Weather Radio TE-PAC - Tropical Pacific TMI - Text Message Intercept TRU - TVS Rapid Update TT - Total Totals TVS - Tornado Vortex Signature TWB - Transcribed Weather Broadcasts","title":"T"},{"location":"appendix/appendix-acronyms/#u","text":"UGC - Universal Geographic Code ULR - User Selectable Layer Reflectivity URL - Universal Resource Locator USD - User Selectable Snow Depth USW - User Selectable Snow Water UTC - Coordinated Universal Time","title":"U"},{"location":"appendix/appendix-acronyms/#v","text":"VAD - Velocity Azimuth Display VCP - volume coverage pattern VIIR - Visible Infrared Imager Radiometer Suite VIL - Vertically Integrated Liquid VTEC - Valid Time and Event Code VWP - VAD Wind Profile","title":"V"},{"location":"appendix/appendix-acronyms/#w","text":"W-ATL - Western Atlantic WFO - Weather Forecast Office WINDEX - Wind Index WMO - World Meteorological Organization WSFO - Weather Service Forecast Office WSO - Weather Service Office WSOM - Weather Service Operations Manual WSR-88D - Weather Surveillance Radar-1988 Doppler WWA - Watch Warning Advisory WV - water vapor","title":"W"},{"location":"appendix/appendix-acronyms/#z","text":"Z - Reflectivity ZDR - 
Differential Reflectivity","title":"Z"},{"location":"appendix/appendix-cots/","text":"Python for AWIPS \uf0c1 Component Version Description Python 2.7.13 Dynamic programming language python-awips 18.1.7 Python AWIPS Data Access Framework Cycler 0.10.0 Python library for composable style cycles Cython 0.28.3 Superset of the Python programming language, designed to give C-like performance with code that is mostly written in Python dateutil 2.7.3 Python extension to the standard datetime module NumPy 1.9.3 Numerical Python Scientific package for Python matplotlib 1.5.3 Python 2D Plotting Library Jep 3.7.1 3.8.2 Java Python interface h5py 1.3.0 HDF5 for Python PyDev 5.4.0 Python Development Environment PyParsing 2.2.0 Python class library for the easy construction of recursive-descent parsers Python QPID 1.36.0 Python API for Qpid Messaging PyTables 3.4.2 Python package for managing hierarchical datasets pytz 2015.4 World Timezone Definitions for Python Setuptools 28.6.0 Tools to download, build, install, upgrade, and uninstall Python packages ScientificPython 2.8.1 Python library for common tasks in scientific computing Shapely 1.6.4 Python package for manipulation and analysis of planar geometric objects. Six 1.11.0 Python 2 and 3 Compatibility Library stomp.py 4.1.20 Python client library for accessing messaging servers werkzeug 0.14.1 Python WSGI utility library YAJSW 12.09 Yet Another Java Service Wrapper Apache for AWIPS \uf0c1 Component Version Description ActiveMQ 5.14.2 JMS ActiveMQ Geronimo 1.1.1 Apache Batik 1.9 Batik is a Java-based toolkit for applications or applets that want to use images in the Scalable Vector Graphics (SVG) format for various purposes, such as display, generation or manipulation. Apache Camel 2.18.3 Enterprise Service Bus Apache Derby 10.12.1 Apache HTTP 4.3.6 Client and Core Apache HTTP Server 2.4.27 Apr 1.6.2 Apache Portable Runtime Project Apr-Util 1.6.0 Apache Portable Runtime Project commons-beanutils 1.9.3 Apache Common Libraries commons-codec 1.10 Apache Common Libraries commons-collections 3.2.2 Apache Common Libraries commons-configuration 1.10 Apache Common Libraries commons-compress 1.10 Apache Common Libraries commons-cli 1.2 Apache Common Libraries commons-digester 1.8.1 Apache Common Libraries commons-io 2.4 Apache Common Libraries commons-cxf 3.1.14 Apache Common Libraries commons-lang 2.6 Apache Common Libraries commons-lang3 3.4 Apache Common Libraries commons-management 1.0 Apache Common Libraries commons-net 3.3 Apache Common Libraries commons-pool 1.6 Apache Common Libraries commons-pool2 2.4.2 Apache Common Libraries commons-ssl Apache Common Libraries commons-validator 1.2.0 Apache Common Libraries Mime4J 0.7 Parser for e-mail message streams in plain rfc822 and MIME format MINA 1.1.7 Network application framework Qpid 6.1.4 Open Source AMQP (Advanced Message Queuing Protocol) Messaging Shiro 1.3.2 Java security framework Thrift 0.10.0 Binary Serialization Framework Velocity 1.7 Templating Engine WSS4J 2.1.4 Web Services Security Xalan 2.7.2 Xerces 2.9.1 XML Resolver 1.2 XML Security 2.0.6 XML Serializer 2.7.1 XML Beans 2.6.0 XML Graphics 2.2 XML Schema 2.1.0 Other COTS and FOSS \uf0c1 Component Version Description Ant 1.9.6 Java Build Tool Ant-Contrib 1.0b3 Additional useful tasks and types for Ant Antlr 2.7.6 Parser generator Atomikos TransactionEssentials 3.6.2 Transaction management system Bitstream Vera Fonts 1.10 Font library from Gnome Bouncy Castle jdk15on-1.54 Java implementation of cryptographic algorithms bzip2 0.9.1 Stream 
compression algorithm C3p0 0.9.1 c3p0 is an easy-to-use library for making traditional JDBC drivers \"enterprise-ready\" by augmenting them with functionality defined by the jdbc3 spec and the optional extensions to jdbc2. cglib 2.1 Byte Code Generation Library is high level API to generate and transform JAVA byte code. distcache 1.4.5-21 Distributed session caching dom4j 1.6.1 An open source library for working with XML, XPath, and XSLT on the Java platform using the Java Collections Framework OpenDAP 2 1.0.3 dwr (direct web remoting) Getahead 1.1.3 Java open source library Eclipse 4.6.1 Java IDE Eclipse Jetty 9.2.19 Servlet Engine and Http Server ehcache 1.3.0 Caching Support FITS Flexible Image Transport System GDAL 2.2.4 GEOS 3.6.2 Geometry Engine, Required for PostGIS GeoTools Java API 16.4 Java API for Manipulation of Geospatial Data GRIBJava 8.0 Grib Java Decoder Groovy 2.4.10 Guava 18.0 Google core libraries for Java Hamcrest 1.3 Java Hamcrest Matchers hdf5 1.8.4-patch1 Core HDF5 APIs hdf5 2.5 Core HDF5 APIs Hibernate 4.2.15 Data Access Layer HIbernate JPA 2.0 API 1.0.1 Hibernate API Istack 2.21 Common Utility Code Runtime IzPack 4.2.0 Installer creator for EDEX Jackson Databind 2.6.5 General data-binding functionality for Jackson JAI 1.1.3 Java API for Image Manipulation JAI \u2013 Image I/O 1.1 Plug-ins for JAI Jasper 1.900.1 JPEG-2000 codec Jasypt 1.9.2 Java simplified encryption Java jdk-8u101 Kit for both 32-bit and 64-bit Javax Servlet API 3.1.0 Jaxen 1.1.4 Open source X-Path Library Javassist 3.18.1 Java Programming Assistant for bytecode manipulation JCommander 1.72 Java framework for parsing command line parameters Jdom 1.1.3 Jdom2 2.0.6 jfreechart 1.0.19 JNA 4.1.0 Joda 2.9.9 Java date and time API jogamp 2.3.2 Provides hardware-supported 3D graphics JSR-275 1.0 beta Measures and Units JUnit 4.12 JTS Topology Suite 1.10 Java API for 2D spatial data lapack 3.4.2 Linear Algebra Package for python ldm 6.13.6 Local Data Manager Log4J 1.2.16 Logging Component used by Commons Logging Logback 1.2.0 libgfortran 4.1.2 Fortran Library Mchange Commons Java 0.2.3.4 Mchange c3p0 0.9.2.1 JDBC3 Connection and Statement Pooling Mockito 1.9.0 Mocking framework for unit tests written in Java mod_wsgi 3.5 Apache HTTP Server module that provides a WSGI compliant interface for hosting Python based web applications. Mozilla Rhino 1.6R7 Implementation of JavaScript embedded in Java NCAR NC2 Libraries 4.6.10 ucar.nc2 containing bufr, cdm, grib, httpservices, and udunits NCEP Grib2 Libraries Libraries for decoding & encoding data in GRIB2 format cnvgrib 1.1.8 and 11.9 Fortran GRIB1 <--> GRIB2 conversion utility g2clib 1.1.8 \"C\" grib2 encoder/decoder g2lib 1.1.8 and 1.1.9 Fortran grib2 encoder/decoder and search/indexing routines w3lib 1.6 and 1.7.1 Fortran grib1 encoder/decoder and utilities ObjectWeb ASM 2.2 ASM is an all-purpose Java bytecode manipulation and analysis framework. 
It can be used to modify existing classes or dynamically generate classes, directly in binary form ObjectWeb ASM OGC Tools GML JTS Converter 1.0.2 Opengis 1.0.2 OpenSAML 3.1.1 Portable implementation of the Security Assertion Markup Language (SAML) org.w3.xml.ext 1.3.04 Apache-hosted set of DOM, SAX, and JAXP interfaces OWASP Enterprise Security API 2.0.1 Open source web application security control library for programmers to write low-risk applications PNGJ 2.1.1 Java library for PNG image IO PostGIS 2.4.4 Geographic Object Support for PostgreSQL PostgreSQL 9.5.13 Database Proj 5.1.0 Cartographic Projections library Protocol Buffers 3.3.1 Core Protocol Buffers library Python megawidgets 1.3.2 Toolkit for building high-level compound widgets in Python using the Tkinter module Quartz 1.8.6 Enterprise Job Scheduler Reflections 0.9.9 Java runtime metadata analysis slf4j 1.7.21 The Simple Logging Facade for Java or (SLF4J) serves as a simple facade or abstraction for various logging frameworks smack 4.1.9 Open Source XMPP (Jabber) client library Spring Framework OSGI 1.2.0 dynamic modules Spring Framework 4.2.9 Layered Java/J2EE application platform Subclipse 1.4.8 Eclipse plugin for Subversion support SWT Add-ons 0.1.1 Add-ons for Eclipse SWT widgets Symphony OGNL 2.7.3 Object-Graph Navigation Language; an expression language for getting/setting properties of Java objects. SZIP 2.1 Compression in HDF Products. Tomcat Native 1.1.17 Library for native memory control UDUNITS 4.6.10 C library provides for arithmetic manipulation of units utilconcurrent 1.3.2 Utility classes Wildfire 3.1.1 Collaboration Server xmltask 1.15.1 Facility for automatically editing XML files as part of an Ant build Vecmath 1.3.1","title":"Appendix cots"},{"location":"appendix/appendix-cots/#python-for-awips","text":"Component Version Description Python 2.7.13 Dynamic programming language python-awips 18.1.7 Python AWIPS Data Access Framework Cycler 0.10.0 Python library for composable style cycles Cython 0.28.3 Superset of the Python programming language, designed to give C-like performance with code that is mostly written in Python dateutil 2.7.3 Python extension to the standard datetime module NumPy 1.9.3 Numerical Python Scientific package for Python matplotlib 1.5.3 Python 2D Plotting Library Jep 3.7.1 3.8.2 Java Python interface h5py 1.3.0 HDF5 for Python PyDev 5.4.0 Python Development Environment PyParsing 2.2.0 Python class library for the easy construction of recursive-descent parsers Python QPID 1.36.0 Python API for Qpid Messaging PyTables 3.4.2 Python package for managing hierarchical datasets pytz 2015.4 World Timezone Definitions for Python Setuptools 28.6.0 Tools to download, build, install, upgrade, and uninstall Python packages ScientificPython 2.8.1 Python library for common tasks in scientific computing Shapely 1.6.4 Python package for manipulation and analysis of planar geometric objects. Six 1.11.0 Python 2 and 3 Compatibility Library stomp.py 4.1.20 Python client library for accessing messaging servers werkzeug 0.14.1 Python WSGI utility library YAJSW 12.09 Yet Another Java Service Wrapper","title":"Python for AWIPS"},{"location":"appendix/appendix-cots/#apache-for-awips","text":"Component Version Description ActiveMQ 5.14.2 JMS ActiveMQ Geronimo 1.1.1 Apache Batik 1.9 Batik is a Java-based toolkit for applications or applets that want to use images in the Scalable Vector Graphics (SVG) format for various purposes, such as display, generation or manipulation. 
Apache Camel 2.18.3 Enterprise Service Bus Apache Derby 10.12.1 Apache HTTP 4.3.6 Client and Core Apache HTTP Server 2.4.27 Apr 1.6.2 Apache Portable Runtime Project Apr-Util 1.6.0 Apache Portable Runtime Project commons-beanutils 1.9.3 Apache Common Libraries commons-codec 1.10 Apache Common Libraries commons-collections 3.2.2 Apache Common Libraries commons-configuration 1.10 Apache Common Libraries commons-compress 1.10 Apache Common Libraries commons-cli 1.2 Apache Common Libraries commons-digester 1.8.1 Apache Common Libraries commons-io 2.4 Apache Common Libraries commons-cxf 3.1.14 Apache Common Libraries commons-lang 2.6 Apache Common Libraries commons-lang3 3.4 Apache Common Libraries commons-management 1.0 Apache Common Libraries commons-net 3.3 Apache Common Libraries commons-pool 1.6 Apache Common Libraries commons-pool2 2.4.2 Apache Common Libraries commons-ssl Apache Common Libraries commons-validator 1.2.0 Apache Common Libraries Mime4J 0.7 Parser for e-mail message streams in plain rfc822 and MIME format MINA 1.1.7 Network application framework Qpid 6.1.4 Open Source AMQP (Advanced Message Queuing Protocol) Messaging Shiro 1.3.2 Java security framework Thrift 0.10.0 Binary Serialization Framework Velocity 1.7 Templating Engine WSS4J 2.1.4 Web Services Security Xalan 2.7.2 Xerces 2.9.1 XML Resolver 1.2 XML Security 2.0.6 XML Serializer 2.7.1 XML Beans 2.6.0 XML Graphics 2.2 XML Schema 2.1.0","title":"Apache for AWIPS"},{"location":"appendix/appendix-cots/#other-cots-and-foss","text":"Component Version Description Ant 1.9.6 Java Build Tool Ant-Contrib 1.0b3 Additional useful tasks and types for Ant Antlr 2.7.6 Parser generator Atomikos TransactionEssentials 3.6.2 Transaction management system Bitstream Vera Fonts 1.10 Font library from Gnome Bouncy Castle jdk15on-1.54 Java implementation of cryptographic algorithms bzip2 0.9.1 Stream compression algorithm C3p0 0.9.1 c3p0 is an easy-to-use library for making traditional JDBC drivers \"enterprise-ready\" by augmenting them with functionality defined by the jdbc3 spec and the optional extensions to jdbc2. cglib 2.1 Byte Code Generation Library is high level API to generate and transform JAVA byte code. 
distcache 1.4.5-21 Distributed session caching dom4j 1.6.1 An open source library for working with XML, XPath, and XSLT on the Java platform using the Java Collections Framework OpenDAP 2 1.0.3 dwr (direct web remoting) Getahead 1.1.3 Java open source library Eclipse 4.6.1 Java IDE Eclipse Jetty 9.2.19 Servlet Engine and Http Server ehcache 1.3.0 Caching Support FITS Flexible Image Transport System GDAL 2.2.4 GEOS 3.6.2 Geometry Engine, Required for PostGIS GeoTools Java API 16.4 Java API for Manipulation of Geospatial Data GRIBJava 8.0 Grib Java Decoder Groovy 2.4.10 Guava 18.0 Google core libraries for Java Hamcrest 1.3 Java Hamcrest Matchers hdf5 1.8.4-patch1 Core HDF5 APIs hdf5 2.5 Core HDF5 APIs Hibernate 4.2.15 Data Access Layer HIbernate JPA 2.0 API 1.0.1 Hibernate API Istack 2.21 Common Utility Code Runtime IzPack 4.2.0 Installer creator for EDEX Jackson Databind 2.6.5 General data-binding functionality for Jackson JAI 1.1.3 Java API for Image Manipulation JAI \u2013 Image I/O 1.1 Plug-ins for JAI Jasper 1.900.1 JPEG-2000 codec Jasypt 1.9.2 Java simplified encryption Java jdk-8u101 Kit for both 32-bit and 64-bit Javax Servlet API 3.1.0 Jaxen 1.1.4 Open source X-Path Library Javassist 3.18.1 Java Programming Assistant for bytecode manipulation JCommander 1.72 Java framework for parsing command line parameters Jdom 1.1.3 Jdom2 2.0.6 jfreechart 1.0.19 JNA 4.1.0 Joda 2.9.9 Java date and time API jogamp 2.3.2 Provides hardware-supported 3D graphics JSR-275 1.0 beta Measures and Units JUnit 4.12 JTS Topology Suite 1.10 Java API for 2D spatial data lapack 3.4.2 Linear Algebra Package for python ldm 6.13.6 Local Data Manager Log4J 1.2.16 Logging Component used by Commons Logging Logback 1.2.0 libgfortran 4.1.2 Fortran Library Mchange Commons Java 0.2.3.4 Mchange c3p0 0.9.2.1 JDBC3 Connection and Statement Pooling Mockito 1.9.0 Mocking framework for unit tests written in Java mod_wsgi 3.5 Apache HTTP Server module that provides a WSGI compliant interface for hosting Python based web applications. Mozilla Rhino 1.6R7 Implementation of JavaScript embedded in Java NCAR NC2 Libraries 4.6.10 ucar.nc2 containing bufr, cdm, grib, httpservices, and udunits NCEP Grib2 Libraries Libraries for decoding & encoding data in GRIB2 format cnvgrib 1.1.8 and 11.9 Fortran GRIB1 <--> GRIB2 conversion utility g2clib 1.1.8 \"C\" grib2 encoder/decoder g2lib 1.1.8 and 1.1.9 Fortran grib2 encoder/decoder and search/indexing routines w3lib 1.6 and 1.7.1 Fortran grib1 encoder/decoder and utilities ObjectWeb ASM 2.2 ASM is an all-purpose Java bytecode manipulation and analysis framework. 
It can be used to modify existing classes or dynamically generate classes, directly in binary form ObjectWeb ASM OGC Tools GML JTS Converter 1.0.2 Opengis 1.0.2 OpenSAML 3.1.1 Portable implementation of the Security Assertion Markup Language (SAML) org.w3.xml.ext 1.3.04 Apache-hosted set of DOM, SAX, and JAXP interfaces OWASP Enterprise Security API 2.0.1 Open source web application security control library for programmers to write low-risk applications PNGJ 2.1.1 Java library for PNG image IO PostGIS 2.4.4 Geographic Object Support for PostgreSQL PostgreSQL 9.5.13 Database Proj 5.1.0 Cartographic Projections library Protocol Buffers 3.3.1 Core Protocol Buffers library Python megawidgets 1.3.2 Toolkit for building high-level compound widgets in Python using the Tkinter module Quartz 1.8.6 Enterprise Job Scheduler Reflections 0.9.9 Java runtime metadata analysis slf4j 1.7.21 The Simple Logging Facade for Java or (SLF4J) serves as a simple facade or abstraction for various logging frameworks smack 4.1.9 Open Source XMPP (Jabber) client library Spring Framework OSGI 1.2.0 dynamic modules Spring Framework 4.2.9 Layered Java/J2EE application platform Subclipse 1.4.8 Eclipse plugin for Subversion support SWT Add-ons 0.1.1 Add-ons for Eclipse SWT widgets Symphony OGNL 2.7.3 Object-Graph Navigation Language; an expression language for getting/setting properties of Java objects. SZIP 2.1 Compression in HDF Products. Tomcat Native 1.1.17 Library for native memory control UDUNITS 4.6.10 C library provides for arithmetic manipulation of units utilconcurrent 1.3.2 Utility classes Wildfire 3.1.1 Collaboration Server xmltask 1.15.1 Facility for automatically editing XML files as part of an Ant build Vecmath 1.3.1","title":"Other COTS and FOSS"},{"location":"appendix/appendix-grid-parameters/","text":"Abbreviation Description Units 0to5 t-5Day Mean Hgt m 2xTP6hr 12Hr Accum Precip from 2 6hr mm 36SHRMi S=Shear incr > 10kts 3-6km 50dbzZ 50dbz Hgt for 1 in. 
Svr Hail m accum_altimeter24 accum_altimeter24 Pa accum_dewpoint24 accum_dewpoint24 F accum_dpFromTenths24 accum_dpFromTenths24 accum_GH12 accum_GH12 m accum_htMan12 accum_htMan12 m accum_numMand12 accum_numMand12 accum_precip1Hour3 accum_precip1Hour3 in accum_precip1Hour6 accum_precip1Hour6 in accum_precip6Hour24 accum_precip6Hour24 in accum_prMan12 accum_prMan12 Pa accum_rawMETAR24 accum_rawMETAR24 accum_sfcPress3 accum_sfcPress3 Pa accum_temperature24 accum_temperatur24 in accum_tempFromTenths24 accum_tempFromTenths24 in accum_windDir24 accum_windDir24 in accum_windSpeed24 accum_windSpeed24 in ACOND Aerodynamic conductance m/s adimc Additional Impervious Area Water Content % ageoVC Ageo Vert Circ ageoW Ageo Wind m/s ageoWM Magnitude Ageo Wind m/s ALBDO Albedo % Along Component Along m/s Alt24Chg Alt24Chg Pa Alti Altimeter hPa ANCConvectiveOutlook ANC Convective Outlook ANCFinalForecast ANC Final Forecast dBZ ANCLayerCompositeReflectivity ANC Layer Composite Reflectivity dBZ AppT Apparent Temperature \u00b0F AV Absolute Vorticity /s AV Vorticity /s BARO Barometric Velocity Vectors m/s BASSW Spectrum Width kts BdEPT06 Max ThetaE Difference (3-6km Min minus 0-3km Max) K BGRUN Baseflow-Groundwater Runoff kg/m^2 BLI Best (4 layer) Lifted Index K BLI Best Lifted Index K BlkMag Bulk Shear Magnitude m/s BlkShr Bulk Shear Vectors m/s BMIXL Blackadar's Mixing Length Scale m BREFMaxHourly Hourly Base Reflectivity Maximum dBZ BrightBandBottomHeight Bright Band Bottom Height m BrightBandTopHeight Bright Band Top Height m BRN Net Bulk Richardson Number BRNEHIi 72% Supercell Cases Tornadic BRNmag m/s BRNSHR BRN Shear BRNvec m/s BRTMP Brightness Temperature K CAPE Convective Available Potential Energy J/kg CAPEc1 Prob CAPE > 500 J/kg % CAPEc2 Prob CAPE > 1000 J/kg % CAPEc3 Prob CAPE > 2000 J/kg % CAPEc4 Prob CAPE > 3000 J/kg % CAPEc5 Prob CAPE > 4000 J/kg % CapeStk Cape Stack capeToLvl cape up to level CAT Clear Air Turbulence % cCape Computed CAPE J/kg cCin Computed CIN J/kg CCOND Canopy Conductance m/s CCP Cloud Cover % CCPerranl Cloud Cover Analysis Uncertainty % CD Drag Coefficient Numeric CDCON Convective Cloud Cover % CDUVB Clear sky UV-B Downward Solar Flux W/m^2 CEIL Ceiling m CFRZR Categorical Freezing Rain CFRZR Categorical Freezing Rain bit CFRZRc1 Chc of Measurable FZRA (Dominant) % CFRZRmean Categorical Freezing Precip mean CFRZRsprd Categorical Freezing Precip sprd CIce Cloud Ice g/m^3 CICE Cloud Ice kg/m^2 CICEP Categorical Ice Pellets CICEP Categorical Ice Pellets bit CICEPc1 Chc of Measurable IP (Dominant) % CICEPmean Categorical Ice Pellets mean CICEPsprd Categorical Ice Pellets sprd Cig Ceiling Height Cigc1 Prob Ceiling Hgt < 500 ft % Cigc2 Prob Ceiling Hgt < 1000 ft % Cigc3 Prob Ceiling Hgt < 3000 ft % CIn Convective Inhibition J/kg ClCond Cloud Condensate g/m^3 CLGTN Categorical Lightning Potential CLGTN2hr 2hr Categorical Lightning Potential climoPW PW % of normal % climoPWimp Import NARR PW in CloudCover Cloud Cover K CLWMR Cloud Mixing Ratio kg/kg CnvP2hr 2hr Convective probability % CnvPcat Categorical convective potential CNWAT Plant Canopy Surface Water mm COCO Correlation Coefficient CompositeReflectivityMaxHourly Hourly Composite Reflectivity Maximum dBZ CONUSMergedReflectivity CONUS Merged Reflectivity dBZ CONUSMergedRHV CONUS Merged RhoHV CONUSMergedZDR CONUS Merged ZDR dB CONUSPlusMergedReflectivity CONUS-Plus Merged Reflectivity dBZ CONVP Categorical Convection Potential CONVP2hr 2hr Convection potential Corf Corfidi Vectors m/s CorfF Corfidi Vectors-Forward Prop kn 
CorfFM Corfidi Vec-Forward Mag kn CorfM Corfidi Vec Mag kn covCat Coverage Category % CP Conv Precip mm CP Convective Precipitation mm CP12hr Convective Precipitation(12 hours) mm CP1hr Convective Precipitation(1 hour) mm CP3hr Convective Precipitation(3 hours) mm CP6hr Convective Precipitation(6 hours) mm CP9hr Convective Precipitation(9 hours) mm CP-GFS Convective Precipitation for GFS mm CPOFP Percent of Frozen Precipitation % CPOFP Probability of Frozen precip % CPOFP Probability of Frozen Precip % CPOLP Probability of liquid precip % CPOP Categorical POP CPOZP Probability of Freezing Precip % CPOZP Probability of Freezing Precip % CPr Condensation Pressure hPa CPRAT Convective Precipitation Rate mm/s CPrD Condensation Pressure Deficit hPa CRAIN Categorical Rain CRAIN Categorical Rain bit CRAINc1 Chc of Measurable Rain (Dominant) % CRAINmean Categorical Rain mean CRAINsprd Categorical Rain sprd CritT1 Layer Min Temperature -6C, -10C K CSDLF Clear Sky Downward Long Wave Flux W/m^2 CSDSF Clear Sky Downward Solar Flux W/m^2 CSNOW Categorical Snow CSNOW Categorical Snow bit CSNOWc1 Chc of Measurable Snow (Dominant) % CSNOWmean Categorical Snow mean CSNOWsprd Categorical Snow sprd CSSI CO Svr Storm Idx CSULF Clear Sky Upward Long Wave Flux W/m^2 CSUSF Clear Sky Upward Solar Flux W/m^2 cTOT Cross Totals C CTSTM Categorical Tstorm CTyp Cloud Type CUEFI Convective Cloud Efficiency non-dim CumNrm Normalized Cumulative Shear /s CumShr Cumulative Shear m/s CURU Cu Rule 0>SKC,-1>SCT,-4 300J/Kg MLCape) HeliD Helicity (NCEP Delivered) m\u00b2/s\u00b2 HI Haines Index HI Haines Index Numeric HI1 Haines Stab Term HI3 HI1 Index Assign HI4 Moist Term Index Assign HIdx Heat Index K HIdx Heat Index K HighLayerCompositeReflectivity High Layer Composite Reflectivity (24-60 kft) dBZ HIWC HiWc K HPBL Height of Planetary Boundary Layer m HPBL Planetary Boundary Layer Height m HTSGW Total Significant Wave Height m HyC Hydrometer Conc g/m^3 ICAHT ICAO Standard Atmosphere Reference Height m ICEC Derived Radar Composite Proportion ICEC Ice Cover ICEC Ice Cover Proportion ICEG Ice growth rate m/s ICETK Ice Thickness m ICI Icing Severity Index ICIP Icing Probability % ICMR Ice Water Mixing Ratio ICNG Icing Potential % ICPRB Icing Probability % ICSEV Icing Severity Index ICSEV Icing severity non-dim ILW Int Liquid Water g/m^2 Into Component Into m/s INV Height of MaxTw above FrzLvl ft IP Icing Pot IPLayer SFC Cold Lyr Probs Toward SLEET ft IRBand4 Infrared Imagery K JFWPRB9-20 Fire Wx: Prob Wind >= 17.5 kts and RH < 20% % KDP Specific Differential Phase deg/km KI K Index K KI K Index K L-I Computed LI \u2103 L3EchoTop Level III High Resolution Enhanced Echo Top Mosaic kft L3VIL Level III High Resolution VIL Mosaic kg/m^2 LAND Land Cover (0=sea, 1=land) Proportion LANDN Land-sea coverage (nearest neighbor) [land=1,sea=0] LAPR Lapse Rate K/m latitude Latitude \u00b0 LatLon Earth Location LCDC Low Cloud Cover % LgSP Large Scale Precipitation mm LgSP1hr Large Scale Precipitation(1 hour) mm LgSP3hr Large Scale Precipitation(3 hour) mm LHF Latent Heat Flux W/m^2 LightningDensity15min CG Lightning Density (15 min.) Flashes/km^2/min LightningDensity1min CG Lightning Density (1 min.) Flashes/km^2/min LightningDensity30min CG Lightning Density (30 min.) Flashes/km^2/min LightningDensity5min CG Lightning Density (5 min.) Flashes/km^2/min LightningJumpGrid Lightning Jump LightningJumpGridMax5min Lightning Jump Max LightningProbabilityNext30min CG Lightning Probability (0-30 min.) 
% LightningProbabilityNext60min CG Lightning Probability (0-60 min.) % LIsfc2x Lifted Index Sfc to \u2103 LLCompositeReflectivity Low-Level Composite Reflectivity dBZ LLWSWind LLWSWind kts LM5 Bunkers Left-Moving Supercell m/s LM6 Elevated Left-Moving Supercell m/s loCape CAPE to 3kmAGL (Tv) J/kg longitude Longitude \u00b0 LowLayerCompositeReflectivity Low Layer Composite Reflectivity (0-24 kft) dBZ LSOIL Liquid soil moisture content (non-frozen) kg/m^2 lsrSample LSR Sample LtgP2hr 2hr Lightning probability % LtgPcat Categorical lightning potential LTNG Lightning non-dim LTNG Max 1hr Lightning Threat (flashes/km^2) LWHR Long-Wave Radiative Heating Rate K/s lzfpc Lower Zone Primary Free Water Content % lzfsc Lower Zone Secondary Free Water Content % lztwc Lower Zone Tension Water Content % MAdv Moisture Adv (g/kg)/s maritimeObscuredSkyIFR ft maritimeObscuredSkyLIFR ft maritimeObscuredSkyMVFR ft maritimeObscuredSkySymIFR maritimeObscuredSkySymLIFR maritimeObscuredSkySymMVFR maritimeObscuredSkySymVFR maritimeObscuredSkyVFR ft maritimeWind20T34 kn maritimeWind34T48 kn maritimeWind48T64 kn maritimeWind64P kn maritimeWindDir20T34 deg maritimeWindDir34T48 deg maritimeWindDir48T64 deg maritimeWindDir64P deg maritimeWindDirLow deg maritimeWindGust20T34 kn maritimeWindGust34T48 kn maritimeWindGust48T64 kn maritimeWindGust64P kn maritimeWindGustLow kn maritimeWindLow kn MaxDVV Max 1hr Downdraft Vertical Velocity m/s maxEPT Max ThetaE (0-3kmAgl) K MaxGRPL1hr Max Hourly Graupel kg/m^2 MaxREF1hr Max Hourly Reflectivity dBZ MAXRH Maximum Relative Humidity % MAXRH12hr 12-hour Maximum Rel Humidity % MAXRH3hr 3-hour Maximum Rel Humidity % MAXUPHL Max 1hr Updraft Helicity m^2/s^2 MAXUPHL Max Updraft Helicity m^2/s^2 MaxUPHL1hr Max Hourly Updft Helicity m^2/s^2 MaxUVV Max 1hr Updraft Vertical Velocity m/s MAXUW U Component of Hourly Maximum Wind Speed m/s MAXVW V Component of Hourly Maximum Wind Speed m/s MaxWGS1hr Max Hourly Wind Gust m/s MaxWHRRR Maximum 1hr Wind Gust m/s MaxWind1hr MaxWind1hr m/s MCDC Medium Cloud Cover % MCon Moisture Flux Div (g/kg)/s MCon2 Moisture Flux Div (Conv only) (g/kg)/s MCONV Horizontal Moisture Convergence kg/kg*s^m^2/s MergedAzShear02kmAGL Low-Level Azimuthal Shear (0-2km AGL) 1/s MergedAzShear36kmAGL Mid-Level Azimuthal Shear (3-6km AGL) 1/s MergedBaseReflectivity Raw Merged Base Reflectivity dBZ MergedBaseReflectivityQC Merged Base Reflectivity dBZ MergedReflectivityAtLowestAltitude Merged Reflectivity At Lowest Altitude (RALA) dBZ MergedReflectivityComposite Raw Composite Reflectivity Mosaic dBZ MergedReflectivityQCComposite Composite Reflectivity dBZ MergedReflectivityQComposite Composite Reflectivity Mosaic dBZ MESH Maximum Estimated Size of Hail (MESH) mm MESHTrack120min MESH Tracks (120 min. accum.) mm MESHTrack1440min MESH Tracks (1440 min. accum.) mm MESHTrack240min MESH Tracks (240 min. accum.) mm MESHTrack30min MESH Tracks (30 min. accum.) mm MESHTrack360min MESH Tracks (360 min. accum.) mm MESHTrack60min MESH Tracks (60 min. accum.) 
mm minEPT Min ThetaE (3-6kmAgl) K MINRH Minimum Relative Humidity % MINRH12hr 12-hour Minimum Rel Humidity % MINRH3hr 3-hour Minimum Rel Humidity % Mix1 850-1000 mx thk Mix2 Thickness: Wintery MIX MIXR Humidity Mixing Ratio kg/kg mixRat Mixing Ratio g/kg MLLCL ML LCL Height m Mmag Moisture Trans Mag g\u00b7m/(kg\u00b7s) MMP MCS Maintenance Probability % MMSP MSLP (MAPS Reduction) Pa MnT Minimum Temperature K MnT Minimum Temperature K MnT12hr 12-hr Minimum Temperature K MnT3hr 3-hr Minimum Temperature K MnT6hr 6-hr Minimum Temperature K MnT_avg Min Temp Ensemble Mean K MnT_perts Min Temp Perturbations K MnT_std Min Temp Ensemble Std Dev K ModelHeight0C Freezing Level Height m ModelSurfaceTemperature Surface Temperature C ModelWetbulbTemperature Wet Bulb Temperature C MountainMapperQPE01H QPE - Mountain Mapper (1 hr. accum.) mm MountainMapperQPE03H QPE - Mountain Mapper (3 hr. accum.) mm MountainMapperQPE06H QPE - Mountain Mapper (6 hr. accum.) mm MountainMapperQPE12H QPE - Mountain Mapper (12 hr. accum.) mm MountainMapperQPE24H QPE - Mountain Mapper (24 hr. accum.) mm MountainMapperQPE48H QPE - Mountain Mapper (48 hr. accum.) mm MountainMapperQPE72H QPE - Mountain Mapper (72 hr. accum.) mm MpV Saturated Geo Pot Vort K/hPa/s MRETag Echo Tops m MRMSVIL Vertically Integrated Liquid (VIL) kg/m^2 MRMSVIL120min VIL Max (120 min.) kg/m^2 MRMSVIL1440min VIL Max (1440 min.) kg/m^2 MRMSVILDensity Vertically Integrated Liquid (VIL) Density g/m^3 MSFDi Isen Moisture Stability Flux Div (g*hPa*m)/(kg*K*s^2) MSFi Isentropic Moisture Stability Flux g\u00b7hPa\u00b7m/(kg\u00b7K\u00b7s) MSFmi Isen Moisture Stability Flux Mag g\u00b7hPa\u00b7m/(kg\u00b7K\u00b7s) MSG Mont Strm Func m MSG Montgomery Stream Function m^2/s^2 msl-P MSL Pressure hPa msl-P2 MSL Pressure (2) hPa msl-P_avg MSL Press Ensemble Mean hPa msl-P_perts MSL Press Perturbations hPa msl-P_std MSL Press Ensemble Std Dev hPa MSL1 MSL1 ft MSL2 MSL2 ft MSL3 MSL3 ft MSL4 MSL4 ft MSL5 MSL5 ft MSLSA Altimeter hPa MTV Moisture Trans Vecs g\u00b7m/(kg\u00b7s) muCape Most Unstable CAPE J/kg MultiSensorP1QPE01H QPE - Multi Sensor P1 (1 hr. accum.) mm MultiSensorP1QPE03H QPE - Multi Sensor P1 (3 hr. accum.) mm MultiSensorP1QPE06H QPE - Multi Sensor P1 (6 hr. accum.) mm MultiSensorP1QPE12H QPE - Multi Sensor P1 (12 hr. accum.) mm MultiSensorP1QPE24H QPE - Multi Sensor P1 (24 hr. accum.) mm MultiSensorP1QPE48H QPE - Multi Sensor P1 (48 hr. accum.) mm MultiSensorP1QPE72H QPE - Multi Sensor P1 (72 hr. accum.) mm MultiSensorP2QPE01H QPE - Multi Sensor P2 (1 hr. accum.) mm MultiSensorP2QPE03H QPE - Multi Sensor P2 (3 hr. accum.) mm MultiSensorP2QPE06H QPE - Multi Sensor P2 (6 hr. accum.) mm MultiSensorP2QPE12H QPE - Multi Sensor P2 (12 hr. accum.) mm MultiSensorP2QPE24H QPE - Multi Sensor P2 (24 hr. accum.) mm MultiSensorP2QPE48H QPE - Multi Sensor P2 (48 hr. accum.) mm MultiSensorP2QPE72H QPE - Multi Sensor P2 (72 hr. accum.) 
mm MXDVV Max Downdraft Vertical Velocity m/s MXREF Max 1hr CAPPI dB MXSALB Maximum Snow Albedo % MxT Maximum Temperature K MxT Maximum Temperature K MxT12hr 12-hr Maximum Temperature K MxT3hr 3-hr Maximum Temperature K MxT6hr 6-hr Maximum Temperature K MxT_avg Max Temp Ensemble Mean K MxT_perts Max Temp Perturbations K MxT_std Max Temp Ensemble Std Dev K MXUVV Max Updraft Vertical Velocity m/s NBDSF Near IR Beam Downward Solar Flux W/m^2 NBE Neg Buoy Energy J/kg NDDSF Near IR Diffuse Downward Solar Flux W/m^2 NetIO Net Isen Adiabatic Omega Pa/s NLAT Latitude (-90 to 90) deg NST Nonsupercell Tornado (>1 NST Threat) NST1 Nonsupercell Tornado (>1 NST Threat NST2 Nonsupercell Tornado (>1 NST Threat numLevels Number of Levels O3MR Ozone Mixing Ratio kg/kg obscuredSky2IFR ft obscuredSky2LIFR ft obscuredSky2MVFR ft obscuredSky2VFR ft obscuredSky3IFR ft obscuredSky3LIFR ft obscuredSky3MVFR ft obscuredSky3VFR ft obscuredSkyIFR ft obscuredSkyLIFR ft obscuredSkyMVFR ft obscuredSkySym2IFR obscuredSkySym2LIFR obscuredSkySym2MVFR obscuredSkySym2VFR obscuredSkySym3IFR obscuredSkySym3LIFR obscuredSkySym3MVFR obscuredSkySym3VFR obscuredSkySymIFR obscuredSkySymLIFR obscuredSkySymMVFR obscuredSkySymVFR obscuredSkyVFR ft obsWind30T50 kn obsWind50P kn obsWindDir30T50 deg obsWindDir50P deg obsWindDirLow deg obsWindGust30T50 kn obsWindGust50P kn obsWindGustLow kn obsWindLow kn obVis Obstruction to Vision OGRD Current Vectors m/s OmDiff mb between -15C Omega and MaxOmega hPa ONE One OTIM Observation Time OZCON Ozone Concentration ppb OZMAX1 Ozone Daily Max from 1-hour Average ppbV OZMAX8 Ozone Daily Max from 8-hour Average ppbV P Pressure hPa P Pressure Pa PAdv Pressure Adv hPa/s PBE Pos Buoy Energy J/kg PBLREG Planetary Boundary Layer Regime PEC Precipitation Potential Placement in PEC_TT24 24h Cumulative Precip Potential Placement in PERPW Primary Wave Mean Period s PERPW Primary Wave Period s Perranl Pressure Analysis Uncertainty Pa Perranl Pressure Error Analysis Pa PERSW Secondary wave mean period s PERSW Secondary Wave Mean Period s PEVAP Potential Evaporation mm PEVPR Potential Evaporation Rate W/m^2 PFrnt 2-D Frontogenesis/Mag Fn K/m/s PGrd Pressure Gradient hPa/m PGrd1 Pressure Gradient dPa/km PGrdM Pressure Grad Mag hPa/m PICE Pecipitating ice content g/m^3 PIVA Thermal Wind Vort Adv /s pkPwr Peak Power dB PLI Parcel Lifted Index (to 500 mb) K PLIxc1 Prob LI < 0 % PLIxc2 Prob LI < -2 % PLIxc3 Prob LI < -4 % PLIxc4 Prob LI < -6 % PLIxc5 Prob LI < -8 % PMSL Pressure Reduced to MSL Pa PMSLmean Mean Sea Level Pressure mean hPa PMSLsprd Mean Sea Level Pressure sprd hPa poesDif11u3_7uIR POES 11u-3.7u Satellite GenericPixel POP Probability of precip % POP12hr 12hr precip probability % POP3hr 3hr precip probability % POP6 POP 6hr % POP6hr 6hr precip probability % POP_001 Prob of .1in/6hr Precip % POP_002 Prob of .3in/6hr Precip % POP_003 Prob of .6in/6hr Precip % POP_004 Prob of 1in/6hr Precip % POP_005 Prob of 2in/6hr Precip % POP_006 Prob of .1in/12hr Precip % POP_007 Prob of .3in/12hr Precip % POP_008 Prob of .6in/12hr Precip % POP_009 Prob of 1in/12hr Precip % POP_010 Prob of 2in/12hr Precip % POP_011 Prob of .05in/6hr Precip % POP_012 Prob of .05in/12hr Precip % POP_013 Prob of 1in/24hr Precip % POP_014 Prob of 2in/24hr Precip % POP_015 Prob of 2in/36hr Precip % POP_016 Prob of 2in/48hr Precip % POROS Soil Porosity Proportion POSH Probability of Severe Hail (POSH) % PoT Potential Temp K PoT Potential Temperature K PoTA Pot Temp Adv K/s PPAM Prob Precip abv nrml % PPAN Prob Precip abv nrml % PPAS Prob 
Precip abv nrml % PPBM Prob Precip blw nrml % PPBN Prob Precip blw nrml % PPBS Prob Precip blw nrml % PPFFG Probability of excessive rain % PPI Precipitation Probability Index % PPI1hr Precipitation Probability Index(1 hour) % PPI6hr Precipitation Probability Index(6 hour) % PPNN Prob Precip near nrml % PR Precip Rate mm/s PR Precipitation Rate mm/s prCloudHgt prCLoud converted to Hgt m prCloudHgtHi prCloudHgt when in hi layer m prCloudHgtLow prCloudHgt when in low layer m prCloudHgtMid prCloudHgt when in mid layer m prcp12hr 12hr probability of 0.01 inch of precip % prcp3hr 3hr probability of 0.01 inch of precip % prcp6hr 6hr probability of 0.01 inch of precip % Precip24Hr Precip24Hr in Precip3Hr Precip3Hr in Precip6Hr Precip6Hr in PrecipRate Radar Precipitation Rate (SPR) mm/hr PrecipType Surface Precipitation Type (SPT) PRESA Pressure Anomaly Pa PresStk Obsolete, replace later presWeather Present Weather Prob34 Prob of Wind Speed > 34 knots m/s Prob50 Prob of Wind Speed > 50 knots m/s Prob64 Prob of Wind Speed > 64 knots m/s ProbDpT50 Probability of Dewpoint temp > 50 degF % ProbDpT55 Probability of Dewpoint temp > 55 degF % ProbDpT60 Probability of Dewpoint temp > 60 degF % ProbDpT65 Probability of Dewpoint temp > 65 degF % ProbDpT70 Probability of Dewpoint temp > 70 degF % ProbVSS10p3Layer Prob Vertical Speed Shear > 20 kts % ProbVSS10p3Sfc Prob 0-2kft Shear > 20 kts % PROCON Probability of convection % PROCON2hr 2hr Convection probability % PROLGHT Lightning probability % PROLGHT2hr 2hr Lightning probability % PRP01H 1hr MRMS Radar-Only ARI year PRP03H 3hr MRMS Radar-Only ARI year PRP06H 6hr MRMS Radar-Only ARI year PRP12H 12hr MRMS Radar-Only ARI year PRP24H 24hr MRMS Radar-Only ARI year PRP30M 30min MRMS Radar-Only ARI year PRPMax Maximum MRMS Radar-Only ARI year PRSIGSV Total Probability of Extreme Severe Thunderstorms % PRSVR Total Probability of Severe Thunderstorms % Psfc Surface pressure hPa PT3 3 hr Pres Change hPa PTAM Prob Temp abv nrml % PTAN Prob Temp abv nrml % PTAS Prob Temp abv nrml % PTBM Prob Temp blw nrml % PTBN Prob Temp blw nrml % PTBS Prob Temp blw nrml % PTNN Prob Temp near nrml % Ptopo Surface pressure hPa PTOR Tornado Probability % PTvA Pot Vorticity Adv K/hPa/s*1.0E5 PTyp Precip Type PTypeRefIP Prob Precip Type is Refreezing Ice Pellets % pV Potential Vorticity K/hPa/s pVeq Equiv Pot Vort K/hPa/s PVORT Potential Vorticity m^2 kg^-1 s^-1 PVV Omega Pa/s PVV Vertical Velocity Pressure Pa/s PW Precipitable Water mm PW Preciptable H2O in PW2 Preciptable H2O >1.4 in. 
in PWmean Precipitable Water mean mm PWS34 Incremental Prob of wind speed >= 34 knots % PWS50 Incremental Prob of wind speed >= 50 knots % PWS64 Incremental Prob of wind speed >= 64 knots % PWsprd Precipitable Water sprd mm qDiv Div Q K/m^2/s*1.0E-12 QMAX Maximum specific humidity at 2m kg/kg QMIN Minimum specific humidity at 2m kg/kg qnVec Qn Vectors K/m^2/s QPECrestSoilMoisture QPE-CREST Soil Moisture % QPECrestStreamflow QPE-CREST Maximum Streamflow (m^3)*(s^-1) QPECrestUStreamflow QPE-CREST Maximum Unit Streamflow (m^3) (s^-1) (km^-2) QPEFFG01H 1hr MRMS Radar-Only QPE-to-FFG Ratio QPEFFG03H 3hr MRMS Radar-Only QPE-to-FFG Ratio QPEFFG06H 6hr MRMS Radar-Only QPE-to-FFG Ratio QPEFFGMax Maximum MRMS Radar-Only QPE-to-FFG Ratio QPEHPStreamflow QPE-Hydrophobic Maximum Streamflow (m^3)*(s^-1) QPEHPUStreamflow QPE-Hydrophobic Maximum Unit Streamflow (m^3) (s^-1) (km^-2) QPESacSoilMoisture QPE-SAC-SMA Soil Moisture % QPESacStreamflow QPE-SAC-SMA Maximum Streamflow (m^3)*(s^-1) QPESacUStreamflow QPE-SAC-SMA Maximum Unit Streamflow (m^3) (s^-1) (km^-2) QPV1 QVec Conv K/m^2/s*1.0E-12 QPV2 Negative EPV* K/hPa/s QPV3 QPV Net QPV4 QG-EPV, RH>75% qsVec Qs Vectors K/m^2/s qVec Q Vectors K/m^2/s RadarAQI01H Radar Accumulation Quality Index 1 hour RadarAQI03H Radar Accumulation Quality Index 3 hour RadarAQI06H Radar Accumulation Quality Index 6 hour RadarAQI12H Radar Accumulation Quality Index 12 hour RadarAQI24H Radar Accumulation Quality Index 24 hour RadarAQI48H Radar Accumulation Quality Index 48 hour RadarAQI72H Radar Accumulation Quality Index 72 hour RadarOnlyQPE01H QPE - Radar Only (1 hr. accum.) mm RadarOnlyQPE03H QPE - Radar Only (3 hr. accum.) mm RadarOnlyQPE06H QPE - Radar Only (6 hr. accum.) mm RadarOnlyQPE12H QPE - Radar Only (12 hr. accum.) mm RadarOnlyQPE12Z QPE - Radar Only (Since 12Z accum.) mm RadarOnlyQPE15M QPE - Radar Only (15 min accum.) mm RadarOnlyQPE24H QPE - Radar Only (24 hr. accum.) mm RadarOnlyQPE48H QPE - Radar Only (48 hr. accum.) mm RadarOnlyQPE72H QPE - Radar Only (72 hr. accum.) 
mm RadarQualityIndex Radar Quality Index (RQI) RAIN Rain content g/m^3 Rain1 850-1000 ra thk Rain2 700-850 ra thk Rain3 Thickness: Rain Likely Raob Raob Interleaved Data rawMETAR24Chg rawMETAR24Chg \u2103 RCQ Humidity parameter in canopy conductance Proportion RCS Solar parameter in canopy conductance Proportion RCSOL Soil moisture parameter in canopy conductance Proportion Reflectivity0C Reflectivity at 0C dBZ ReflectivityAtLowestAltitude Reflectivity At Lowest Altitude (RALA) dBZ ReflectivityM10C Reflectivity at -10C dBZ ReflectivityM15C Reflectivity at -15C dBZ ReflectivityM20C Reflectivity at -20C dBZ ReflectivityM5C Reflectivity at -5C dBZ RETOP Echo Top m RH Rel Humidity % RH Relative Humidity % RH_001 Prob of RH Grtn 70 percent % RH_001_bin Binary Prob of RH Grtn 70 percent RH_001_perts Prob of RH Grtn 70 percent Perts RH_002 Prob of RH Grtn 90 percent % RH_002_bin Binary Prob of RH Grtn 90 percent RH_002_perts Prob of RH Grtn 90 percent Perts RH_avg Rel Humidity Ensemble Mean % RH_perts Rel Humidity Perturbations % RH_std Rel Humidity Ensemble Std Dev % RHmean Relative Humidity mean % RHsprd Relative Humidity spread % RIME Rime Factor non-dim RLYRS Number of Soil Layers in Root Zone Numeric RM5 Bunkers Right-Moving Supercell m/s RM6 Elevated Right-Moving Supercell m/s RMGH2 t-2Day Mean Hgt m RMprop Right Mover Propagation Vector RMprop2 Elevated Right Mover Propagation Vector rms root mean square kn Ro Rossby Number Vag/Vg RotationTrackLL120min Low-Level Rotation Tracks 0-2km AGL (120 min. accum.) 1/s RotationTrackLL1440min Low-Level Rotation Tracks 0-2km AGL (1440 min. accum.) 1/s RotationTrackLL240min Low-Level Rotation Tracks 0-2km AGL (240 min. accum.) 1/s RotationTrackLL30min Low-Level Rotation Tracks 0-2km AGL (30 min. accum.) 1/s RotationTrackLL360min Low-Level Rotation Tracks 0-2km AGL (360 min. accum.) 1/s RotationTrackLL60min Low-Level Rotation Tracks 0-2km AGL (60 min. accum.) 1/s RotationTrackML120min Mid-Level Rotation Tracks 3-6km AGL (120 min. accum.) 1/s RotationTrackML1440min Mid-Level Rotation Tracks 3-6km AGL (1440 min. accum.) 1/s RotationTrackML240min Mid-Level Rotation Tracks 3-6km AGL (240 min. accum.) 1/s RotationTrackML30min Mid-Level Rotation Tracks 3-6km AGL (30 min. accum.) 1/s RotationTrackML360min Mid-Level Rotation Tracks 3-6km AGL (360 min. accum.) 1/s RotationTrackML60min Mid-Level Rotation Tracks 3-6km AGL (60 min. accum.) 
1/s routed_flow Channel Routed Flow [Low] routed_flow_c Channel Routed Flow [Combo] routed_flow_h Channel Routed Flow [Hi] routed_flow_m Channel Routed Flow [Mid] RR Reflectivity dBZ RRtype Radar w/PType dBZ RRV Radial Velocity kts RSMIN Minimal Stomatal Resistance s/m RV Rel Vorticity /s RWMR Rain Mixing Ratio kg/kg s2H2O_CLIMO Climatological -SON/DJF/MAM- Snow-to-water ratio s2H2O_GFS GFS Snow-to-water ratio s2H2O_MEAN HPC Mean Snow-to-water ratio s2H2O_NAM NAM Snow-to-water ratio SA12hr 12 Hr Snow Accum mm SA1hr 1 Hr Snow Accum mm SA24hr 24 Hr Snow Accum mm SA36hr 36 Hr Snow Accum mm SA3hr 3 Hr Snow Accum mm SA48hr 48 Hr Snow Accum mm SA6hr 6 Hr Snow Accum mm SAcc Snow Accum via Thickness mm SALIN Practical Salinity SALTY Salinity kg/kg SAmodel Model Run Snow via Thickness mm SArun Model Run Snow Accum via Thickness mm satCloudPhase Satellite Cloud Phase[8.5-11.2 um] K SATD Saturation Deficit Pa satDif11u12uIR 11u-12u Satellite GenericPixel satDif11u13uIR 11u-13u Satellite GenericPixel satDif11u3_9uIR 11u-3.9u Satellite GenericPixel satDivWVIR IR in WV Satellite DerivedWV satFog Satellite Fog[3.9-11.2 um] K satMoisture Satellite Moisture[11.2-12.3 um] K satSnow Satellite Snow[0.64-1.61 um] satUpperLevelInfo Satellite Upper Level Info[11.2-6.19 um] K satVegetation Satellite Vegetation[0.64-0.87 um] SBSNO Sublimation (evaporation from snow) W/m^2 SBT113 Simulated Brightness Temperature for GOES 11, Channel 3 K SBT114 Simulated Brightness Temperature for GOES 11, Channel 4 K SBT123 Simulated Brightness Temperature for GOES 12, Channel 3 K SBT124 Simulated Brightness Temperature for GOES 12, Channel 4 K sce NOHRSC Snow Coverage Elevation kft SCP Snow Cover SCP Snow Cover % SCWind SCWind m/s SDEN Snow Density kg/m\u00b3 SDENCLIMO Climatological -SON/DJF/MAM- Snow Density kg/m\u00b3 SDENGFS GFS Snow Density kg/m\u00b3 SDENMEAN HPC Mean Snow Density kg/m\u00b3 SDENNAM NAM Snow Density kg/m\u00b3 SeamlessHSR Seamless Hybrid Scan Reflectivity (SHSR) dBZ SeamlessHSRHeight Seamless Hybrid Scan Reflectivity (SHSR) Height km SFCR Surface Roughness m SH Spec Humidity SH Specific Humidity % Shear Shear (Vector) /s SHF Sensible Heat Flux W/m^2 SHI Severe Hail Index (SHI) ShrMag Shear Magnitude /s shWlt Showalter Index \u2103 SHx Spec Humidity g/kg SIGHAILPROB Significant Hail Probability % SIGTRNDPROB Significant Tornado Probability % SIGWINDPROB Significant Wind Probability % SIPD Supercooled Large Droplet Threat SLDP Supercooled Large Droplet Threat SLI Lifted Index K SLI Surface Lifted Index K SLTYP Surface Slope Type Index SMC Soil Moisture % SMDRY Direct Evaporation Cease (soil moisture) Proportion SMREF Transpiration Stress-onset (soil moisture) Proportion SnD Snow Depth m SnD Snow Depth m SNFALB Snow-Free Albedo SNMR Snow Mixing Ratio kg/kg SNOL12c1 Prob 12-hr SNOW > 1 in % SNOL12c10 Prob 12-hr SNOW > 24 in % SNOL12c2 Prob 12-hr SNOW > 2 in % SNOL12c3 Prob 12-hr SNOW > 4 in % SNOL12c4 Prob 12-hr SNOW > 6 in % SNOL12c5 Prob 12-hr SNOW > 7.5 in % SNOL12c6 Prob 12-hr SNOW > 8 in % SNOL12c7 Prob 12-hr SNOW > 10 in % SNOL12c8 Prob 12-hr SNOW > 12 in % SNOL12c9 Prob 12-hr SNOW > 16 in % SNOL12mean 12-hr Snowfall mean mm SNOL12sprd 12-hr Large scale Snowfall sprd mm SNOM Snow Melt kg/m^2 snoRat snoRatCrocus Snow Ratio - Crocus/ECMWF snoRatEMCSREF Snow Ratio: EMC SREF snoRatOv2 snoRatSPC Snow Ratio - SPC snoRatSPCdeep Snow Ratio - SPC 0-3km MaxT snoRatSPCsurface Snow Ratio - SPCsurface snoRatWPC Snow Ratio - WPC Mean SNOW Snow content g/m^3 Snow1 850-1000 sn thk Snow2 700-850 sn thk Snow3 Thickness: 
Snow Likely snowd3hr 3hr Snow Depth m snowd6hr 6hr Snow Depth m SNOWLVL Snow Level m SnowT Preferred Ice Growth K SNSQ Snow Sql Parameter SNW Sect Norm Wind m/s SNWA Ageo Sect Norm Wind kn SOILM Soil Moisture Content kg/m^2 SOILW Volumetric Soil Moisture Content Proportion SOTYP Soil Type SPAcc Storm Total Precip mm SPBARO Barotropic Velocity m/s SPC Current Speed m/s SPC Surface Current Speed m/s Spd24Chg Spd24Chg kn sRank Feature Strength Rank SRMl Storm Relative Flow Vectors LM m/s SRMlM Storm Relative Flow Mag LM m/s SRMm Storm Relative Flow Vecs (Mean Wind) m/s SRMmM Storm Relative Flow Mag (Mean Wind) m/s SRMr Storm Relative Flow Vecs (RM) m/s SRMrM Storm Relative Flow Mag (RM) m/s SSAcc Storm Total Snow mm SSi Isentropic Static Stability hPa/K SSP Significant Severe Parameter SSRUN Storm Surface Runoff kg/m^2 St-Pr Stable Precipitation mm St-Pr1hr 1 hr Stable Precipitation mm St-Pr2hr 2 hr Stable Precipitation mm St-Pr3hr 3 hr Stable Precipitation mm staName StaName stationId Station Id C stdDewpoint Std Dewpoint K stdMaxWindSpeed Std Max Wind Speed m/s stdSkyCover Std Sky Cover stdTemperature Std Temperature K stdWindDir Std Wind Direction stdWindSpeed Std Wind Speed m/s STP Sig. Tornado Parameter (>1 Sig Tor) STP1 Sig. Tornado Parameter (>1 Sig Tor) STRM Stream Function m^2/s StrmMot Storm Motion kn StrTP Strong Tornado Parameter m/s^2 SuCP Supercell Composite Parameter SUNSD Sunshine Duration s SuperLayerCompositeReflectivity Super Layer Composite Reflectivity (33-60 kft) dBZ SVV Sigma Coordinate Vertical Velocity /s SWDIR Direction of Swell Waves deg SWdir Swell Direction swe NOHRSC Snow Water Equivalent in SWELL Significant Height of Swell Waves m SWELL Swell Height m SWHR Solar Radiative Heating Rate K/s SWLEN Mean length of swell waves m SWPER Mean Period of Swell Waves s SWPER Swell Period s SWSTP Steepness of swell waves swtIdx Sweat Index SynPrecip24Hr SynPrecip24Hr mm SynthPrecipRateID QPE - Synthetic Precip Rate ID T Temperature K T Temperature K T24Chg T24Chg \u00b0F T24hr 24 hr Temperature K T_001 Prob of Temp Lstn 0C % T_001_bin Binary Prob of Temp Lstn 0C T_001_perts Prob of Temp Lstn 0C Perturbations T_avg Temperature Ensemble Mean K T_perts Temperature Perturbations K T_std Temperature Ensemble Std Dev K Ta Temperature Anomaly K TAdv Temperature Adv K/s Tc1 Prob Temp < O C % TCC Total Cloud Cover % TCCerranl Total Cloud Cover Error Analysis % TCICON Total Column-Integrated Condensate kg/m^2 TCLSW Total Column Integrated Supercooled Liquid Water kg/m^2 TCOLG Total Column Integrated Graupel kg/m^2 TCOLI Total Column-Integrated Cloud Ice kg/m^2 TCOLM Total Column Integrated Melting Ice kg/m^2 TCOLR Total Column Integrated Rain kg/m^2 TCOLS Total Column Integrated Snow kg/m^2 TCOLW Total Column-Integrated Cloud Water kg/m^2 TCOND Total Condensate kg/kg Tdef Total Deformation /s*100000.0 Tdend Dendritic Growth Temperatures K Terranl Temperature Analysis Uncertainty K Terranl Temperature Error Analysis K TGrd Temperature Gradient K/m TGrdM Temperature Grad Mag K/m ThetaE Theta E K ThGrd Temperature Gradient \u2103/m Thom5 S-R Flow Thom5a S-R Flow Thom6 S-R Flow Suggests Tor Supercells ThP Thunderstorm probability % ThP Thunderstorm Probability % ThP12hr 12hr Thunderstorm probability % ThP3hr 3hr Thunderstorm probability % ThP6hr 6hr Thunderstorm probability % ThPcat Categorical thunderstorm TiltAng Radar Tilt Angle deg TKE Turb Kin Energy J/kg TKE Turbulent Kinetic Energy J/kg Tmax Layer Max Temperature K TmDpD Temp minus Dewp Dep Tmean Temperature mean K Tmin Layer Min 
Temperature K Topo Topography m TORi BRNSHR,EHI,LRate>3C/km,CIN < 150 TORi2 BRNSHR,EHI,0-2km LRate > 3C/km TotQi Isentropic Total Moisture g\u00b7hPa/(kg\u00b7K) TOTSN 24hr Snowfall m TOTSN12hr 12hr Snowfall m TOZNE Total Ozone DU TP Precipitation mm TP Total Precipitation mm TP120hr 5 Day Total Gridded Precip in TP12c1 12-hr POP > 0.01 in % TP12c2 12-hr POP > 0.05 in % TP12c3 12-hr POP > 0.10 in % TP12c4 12-hr POP > 0.25 in % TP12c5 12-hr POP > 0.50 in % TP12c6 12-hr POP > 1.00 in % TP12c7 12-hr POP > 1.50 in % TP12c8 12-hr POP > 2.00 in % TP12hr 12 Hr Accum Precip mm TP12hr Total Precipitation(12 hours) mm TP12mean 12-hr Total Precip mean mm TP12sprd 12-hr Total Precip sprd mm TP168hr 7 Day Total Gridded Precip mm TP18hr Total Precipitation(18 hours) mm TP1hr 1 Hr Accum Precip mm TP1hr Total Precipitation(1 hour) mm TP24c1 24-hr POP > 0.01 in % TP24c2 24-hr POP > 0.05 in % TP24c3 24-hr POP > 0.10 in % TP24c4 24-hr POP > 0.25 in % TP24c5 24-hr POP > 0.50 in % TP24c6 24-hr POP > 1.00 in % TP24c7 24-hr POP > 1.50 in % TP24c8 24-hr POP > 2.00 in % TP24hr 24 Hr Accum Precip mm TP24hr Total Precipitation(24 hours) mm TP24hr_avg 24hr Precip Ensemble Mean mm TP24hr_perts 24hr Precip Perturbations mm TP24hr_std 24hr Precip Ensemble Std Dev mm TP24mean 24-hr Total Precip mean mm TP24sprd 24-hr Total Precip sprd mm TP36hr 36 Hr Accum Precip mm TP3c1 3-hr POP > 0.01 in % TP3c2 3-hr POP > 0.05 in % TP3c3 3-hr POP > 0.10 in % TP3c4 3-hr POP > 0.25 in % TP3c5 3-hr POP > 0.50 in % TP3c6 3-hr POP > 1.00 in % TP3c7 3-hr POP > 1.50 in % TP3c8 3-hr POP > 2.00 in % TP3hr 3 Hr Accum Precip mm TP3hr Total Precipitation(3 hours) mm TP3mean 3-hr Total Precip mean mm TP3sprd 3-hr Total Precip sprd mm TP48hr 48 Hr Accum Precip mm TP48hr Total Precipitation(48 hours) mm TP6c1 6-hr POP > 0.01 in % TP6c2 6-hr POP > 0.05 in % TP6c3 6-hr POP > 0.10 in % TP6c4 6-hr POP > 0.25 in % TP6c5 6-hr POP > 0.50 in % TP6c6 6-hr POP > 1.00 in % TP6c7 6-hr POP > 1.50 in % TP6c8 6-hr POP > 2.00 in % TP6hr 6 Hr Accum Precip mm TP6hr Total Precipitation(6 hours) mm TP6hr_avg 6hr Precip Ensemble Mean mm TP6hr_perts 6hr Precip Perturbations mm TP6hr_std 6hr Precip Ensemble Std Dev mm TP6mean 6-hr Total Precip mean mm TP6sprd 6-hr Total Precip sprd mm TP72hr 3 Day Total Gridded Precip mm TP9hr Total Precipitation(9 hours) mm TP_ACR ACR Precip in TP_ALR ALR Precip in TP_avg Precip Ensemble Mean mm TP_ECMWF ECMWF Precipitation in TP_ECMWF12hr ECMWF 12 Hr Accum Precip in TP_FWR FWR Precip in TP_HPC HPC Precip in TP_KRF KRF Precip in TP_MSR MSR Precip in TP_ORN ORN Precip in TP_perts Precip Perturbations mm TP_PTR PTR Precip in TP_RHA RHA Precip in TP_RSA RSA Precip in TP_std Precip Ensemble Std Dev mm TP_STR STR Precip in TP_TAR TAR Precip in TP_TIR TIR Precip in TP_TUA TUA Precip in TPFI Turbulence Index TPFI Turbulence Potential Forecast Index TP-GFS Total Precipitation for GFS mm tpHPC HPC Precip in tpHPCndfd Precipitation mm TPmodel Model Run Precip mm TPrun Run Accum Pcpn mm TPrun_avg Accum Precip Ensemble Mean mm TPrun_perts Accum Precip Perturbations mm TPrun_std Accum Precip Ensemble Std Dev mm TPx12x6 12-6 Hr Accum Precip mm TPx1x3 3x1 Hr Accum Precip mm TPx3 3 Hr Accum Precip mm TQIND TQ Index 12=Cold Pool 17=Embedded Convection C TRANS Transpiration W/m^2 transparentMaritimeSky ft transparentMaritimeSkySym ft transparentSky ft transparentSky2 ft transparentSky3 ft transparentSkySym ft transparentSkySym2 ft transparentSkySym3 ft TransWind TransWind kts TShrMi S=0-6km Shear Supports Scells TSLSA 3 hr Pres Change hPa TSNOW Total 
Snow kg/m^2 TSOIL Soil Temperature K Tsprd Temperature spread K TSRWE Total Snowfall Rate Water Equivalent kg/m^2/s Tstk Temp Stack K tTOT Total Totals C TURB Turbulence Index TV Virtual Temperature K TW Wet Bulb Temp K tWind Thermal Wind kn tWindU U Component of Thermal Wind kn tWindV V Component of Thermal Wind kn TwMax Layer Max Wet-bulb Temperature K TwMin Layer Min Wet-bulb Temperature K TWO Two Twstk Wet-bulb Temp Stack K TxSM Filtered-500km Temp C U-GWD Zonal Flux of Gravity Wave Stress N/m^2 UFLX Momentum Flux, U-Component N/m^2 uFX Geo Momentum m/s ulSnoRat ULWRF Comp Refl dBZ ULWRF Upward Long-Wave Rad. Flux W/m^2 UPHL Updraft Helicity m^2/s^2 USTM U-Component of Storm Motion m/s USWRF Reflectivity dBZ USWRF Upward Short-Wave Radiation Flux W/m^2 uv2 Horz Variance m^2/s^2 uW u Component of Wind m/s uW U-Component of Wind m/s uWerranl uWmean m/s uWsprd uWStk U Stack m/s uzfwc Upper Zone Free Water Content % uztwc Upper Zone Tension Water Content % V-GWD Meridional Flux of Gravity Wave Stress N/m^2 VAdv Vorticity Adv /s*1.0E9 VAdvAdvection Vorticity Adv /s VAPP Vapor Pressure Pa VBDSF Visible Beam Downward Solar Flux W/m^2 VEG Vegetation % vertCirc Vertical Circulation VFLX Momentum Flux, V-Component N/m^2 VGP Vort Gen Param VGTYP Vegetation Type Integer (0-13) VII Vertically Integrated Ice (VII) kg/m^2 VILIQ Vertically Integrated Liquid (VIL) kg/m^2 Vis Visibility m Vis Visibility m visbyIFR mi visbyLIFR mi visbyMVFR mi visbyVFR mi Visc1 Prob Sfc Visibility < 1 mile % Visc2 Prob Sfc Visibility < 3 miles % Visc23 Prob Sfc Visibility < 5 miles % visCat Categorical visibility Viserranl Visibility Analysis Uncertainty m Viserranl Visibility Error Analysis m Visible Visible Imagery VPT Virtual Potential Temperature K VRATE Ventilation Rate m^2/s vSmthW Verticall Smoothed Wind m/s VSS Vertical Shear Speed /s VSTM V-Component of Storm Motion m/s VTMP Virtual Temperature K vTOT Vertical Totals VUCSH Vertical u-component shear /s VV Vertical velocity m/s VVCSH Vertical v-component shear /s vW v Component of Wind m/s vW V-Component of Wind m/s vWerranl vWmean m/s vwpSample VWP Sample VWSH Vertical Speed Shear /s vWsprd vWStk V Stack m/s w2 Vert Variance m^2/s^2 WarmRainProbability Probability of Warm Rain % water_depth Hillslope Water Depth in WaterVapor Water Vapor Imagery K WATR Water Runoff kg/m^2 WCD Warm Cloud Depth Approx.: Frzlvl-LCL Thickness m WD Wind Direction (from which blowing) deg WD Wind direction deg WDea Wind Direction Analysis Uncertainity deg WDEPTH Geometric Depth Below Sea Surface m WDerranl Wind Direction Error Analysis deg wDiv Wind Divergence /s WDmean Wind Direction mean deg WEASD Water Equiv accum snow depth m WEASD Water Equivalent of Accumulated Snow Depth mm WGH 5-Wave Geopotential Height gpm WGH 5-wave geopotential height m WGS Wind Gust Speed m/s WGS Wind Gust Speed m/s WGS1hr Max 1-hr Wind Gust Speed m/s WGSea Wind Gust Speed Analysis Uncertainty m/s WGSerranl Wind Gust Speed Error Analysis m/s WGSMX1hr Max Hourly Wind Gust m/s WILT Wilting Point Proportion Wind Wind m/s Wind_avg Wind Ensemble Mean m/s Wind_perts Wind Perturbations m/s Windmean Mean Wind kn WINDPROB Wind Probability % WMIXE Wind Mixing Energy J WndChl Wind Chill K WS Wind Speed m/s WSc1 Prob SFC wind speed > 25 kt % WSc2 Prob SFC wind speed > 34 kt % WSc3 Prob SFC wind speed > 48 kt % WSc4 Prob SFC wind speed > 50 kt % WSc6 Prob SFC wind speed > 20 kt % WSc7 Prob SFC wind speed > 30 kt % WSc8 Prob SFC wind speed > 40 kt % WSerranl Wind Speed Error Analysis m/s WSmean Wind Speed mean m/s wSp 
Wind speed m/s wSp_001 Prob of Wind Grtn 40kts % wSp_001_bin Binary Prob of Wind Grtn 40kts wSp_001_perts Prob of Wind Grtn 40kts Perts wSp_002 Prob of Wind Grtn 50kts % wSp_002_bin Binary Prob of Wind Grtn 50kts wSp_002_perts Prob of Wind Grtn 50kts Perts wSp_003 Prob of Wind Grtn 60kts % wSp_003_bin Binary Prob of Wind Grtn 60kts wSp_003_perts Prob of Wind Grtn 60kts Perts wSp_004 Prob of Wind Grtn 30kts % wSp_004_bin Binary Prob of Wind Grtn 30kts wSp_004_perts Prob of Wind Grtn 30kts Perts wSp_avg Windspeed Ensemble Mean m/s wSp_perts Windspeed Perturbations m/s wSp_std Windspeed Ensemble Std Dev m/s wSpea Wind Speed Analysis Uncertainty kn wSpmean Mean Windspeed kt wSpsprd Windspeed spread kt WSsprd Wind Speed sprd m/s WVDIR Direction of Wind Waves deg WVdir Wind Wave Direction wvHeight wvHeight m WVHGT Significant Height of Wind Waves m WVHGT Wind Wave Height m WVLEN Mean length of wind waves m WVPER Mean Period of Wind Waves s WVPER Wind Wave Period s wvPeriod wvPeriod WVSTP Steepness of wind waves wvType wvType wW w Component of Wind cm/s wx Weather zAGL Height AGL m ZDR Differential Reflectivity dB","title":"AWIPS Grid Parameters"},{"location":"appendix/appendix-wsr88d/","text":"Product Name Mnemonic ID Levels Res Elevation Reflectivity (Z) Z 19 16 100 .5 Reflectivity (Z) Z 19 16 100 1.5 Reflectivity (Z) Z 19 16 100 2.5 Reflectivity (Z) Z 19 16 100 3.5 Reflectivity (Z) Z 20 16 200 .5 Velocity (V) V 27 16 100 .5 Velocity (V) V 27 16 100 1.5 Velocity (V) V 27 16 100 2.5 Velocity (V) V 27 16 100 3.5 Storm Rel Velocity (SRM) SRM 56 16 100 .5 Storm Rel Velocity (SRM) SRM 56 16 100 1.5 Storm Rel Velocity (SRM) SRM 56 16 100 2.5 Storm Rel Velocity (SRM) SRM 56 16 100 3.5 Composite Ref (CZ) CZ 37 16 100 -1 Composite Ref (CZ) CZ 38 16 400 -1 Lyr Comp Ref Max (LRM) Level 1 LRM 65 8 0 -1 Lyr Comp Ref Max (LRM) Level 2 LRM 66 8 0 -1 Lyr Comp Ref Max (LRM) Level 3 LRM 90 8 0 -1 Lyr Comp Ref MAX (APR) APR 67 16 0 -1 Echo Tops (ET) ET 41 16 0 -1 Vert Integ Liq (VIL) VIL 57 16 0 -1 One Hour Precip (OHP) OHP 78 16 0 -1 Storm Total Precip (STP) STP 80 16 0 -1 VAD Wind Profile (VWP) VWP 48 0 0 -1 Digital Precip Array (DPA) DPA 81 256 400 -1 Velocity (V) V 25 16 100 .5 Base Spectrum Width (SW) SW 28 8 100 .5 Base Spectrum Width (SW) SW 30 8 100 .5 Severe Weather Probability (SWP) SWP 47 0 100 -1 Storm Tracking Information (STI) STI 58 0 100 -1 Hail Index (HI) HI 59 0 100 -1 Mesocyclone (M) M 60 0 100 -1 Mesocyclone (MD) MD 141 0 0 1 Tornadic Vortex Signature (TVS) TVS 61 0 100 -1 Storm Structure (SS) SS 62 0 100 -1 Supplemental Precipitation Data (SPD) SPD 82 0 100 -1 Reflectivity (Z) Z 94 256 100 .5 Reflectivity (Z) Z 94 256 100 1.5 Reflectivity (Z) Z 94 256 100 2.4 Reflectivity (Z) Z 94 256 100 3.4 Reflectivity (Z) Z 94 256 100 4.3 Reflectivity (Z) Z 94 256 100 5.3 Reflectivity (Z) Z 94 256 100 6.2 Reflectivity (Z) Z 94 256 100 7.5 Reflectivity (Z) Z 94 256 100 8.7 Reflectivity (Z) Z 94 256 100 10.0 Reflectivity (Z) Z 94 256 100 12.0 Reflectivity (Z) Z 94 256 100 14.0 Reflectivity (Z) Z 94 256 100 16.7 Reflectivity (Z) Z 94 256 100 19.5 Velocity (V) V 99 256 25 .5 Velocity (V) V 99 256 25 1.5 Velocity (V) V 99 256 25 2.4 Velocity (V) V 99 256 25 3.4 Velocity (V) V 99 256 25 4.3 Velocity (V) V 99 256 25 5.3 Velocity (V) V 99 256 25 6.2 Velocity (V) V 99 256 25 7.5 Velocity (V) V 99 256 25 8.7 Velocity (V) V 99 256 25 10.0 Velocity (V) V 99 256 25 12.0 Velocity (V) V 99 256 25 14.0 Velocity (V) V 99 256 25 16.7 Velocity (V) V 99 256 25 19.5 Super Res Reflectivity (Z) HZ 153 256 25 .5 Super Res
Reflectivity (Z) HZ 153 256 25 1.5 Super Res Velocity (V) HV 154 256 25 .5 Super Res Velocity (V) HV 154 256 25 1.5 Super Res Spec Width (SW) HSW 155 256 25 .5 Super Res Spec Width (SW) HSW 155 256 25 1.5 Spectrum Width (SW) SW 30 8 100 1.5 Spectrum Width (SW) SW 28 8 25 1.5 Digital Vert Integ Liq (DVL) DVL 134 256 100 -1 Digital Hybrid Scan Refl (DHR) DHR 32 256 100 -1 Enhanced Echo Tops (EET) EET 135 256 100 -1 Digital Meso Detection (DMD) DMD 149 0 0 16384 TVS Rapid Update (TRU) TRU 143 0 0 16384 User Selectable Lyr Refl (ULR) ULR 137 16 100 -1 Storm Total Precip (STP) STP 138 256 200 -1 1-Hour Snow-Water Equiv (OSW) OSW 144 16 100 -1 1-Hour Snow Depth (OSD) OSD 145 16 100 -1 Storm Tot Snow Depth (SSD) SSD 147 16 100 -1 Storm Tot Snow-Water Equiv (SSW) SSW 146 16 100 -1 Differential Refl (ZDR) ZDR 158 16 100 .5 Differential Refl (ZDR) ZDR 159 256 25 16384 Correlation Coeff (CC) CC 160 16 100 .5 Correlation Coeff (CC) CC 161 256 25 16384 Specific Diff Phase (KDP) KDP 162 16 100 .5 Specific Diff Phase (KDP) KDP 163 256 25 16384 Hydrometeor Class (HC) HC 164 16 100 .5 Hydrometeor Class (HC) HC 165 256 25 16384 Melting Layer (ML) ML 166 0 0 16384 Hybrid Hydrometeor Class (HHC) HHC 177 256 25 -1 Digital Inst Precip Rate (DPR) DPR 176 0 25 -1 One Hour Accum (OHA) OHA 169 16 200 -1 User Select Accum (DUA) DUA 173 256 25 -1 User Select Accum (DUA) DUA 173 256 25 -1 Storm Total Accum (STA) STA 171 16 200 -1 Storm Total Accum (DSA) STA 172 256 25 -1 One Hour Diff (DOD) DOD 174 256 25 -1 Storm Total Diff (DSD) DSD 175 256 25 -1","title":"WSR-88D Product Table"},{"location":"appendix/common-problems/","text":"Common Problems \uf0c1 All Operating Systems \uf0c1 Removing caveData \uf0c1 Removing caveData (flushing the local cache) should be one of the first troubleshooting steps to take when experiencing weird behavior in CAVE. The cache lives in a folder called caveData , hence why this process is also referred to as removing or deleting caveData. Linux \uf0c1 For Linux users, the easiest way is to open a new terminal and run the following command: rm -rf ~/caveData Windows \uf0c1 For Windows users, simply delete the caveData folder in your home user directory: Mac \uf0c1 For Mac users, the easiest way is to open a new terminal and run the following command: rm -rf ~/Library/caveData Disappearing Configurations \uf0c1 If you ever notice some of the following settings you've configured/saved disappear from CAVE: Saved Displays or Procedures NSHARP settings (line thickness, etc) Colormap settings StyleRule settings This is not a fully exhaustive list, so if something else has disappeared it might be the same underlying issue still. Then it is likely we have recently changed our production EDEX server. There is a good chance we can recover your settings. To do so, please send a short email to support-awips@unidata.ucar.edu with the topic \"Missing Configurations\", and include the username(s) of the computer(s) you use to run CAVE. Remotely Connecting to CAVE \uf0c1 Since the pandemic began, many users have asked if they can use X11 forwarding or ssh tunneling to remotely connect to CAVE machines. This is not recommended or supported , and CAVE crashes in many different ways and expresses strange behavior as well. We highly recommend you download the appropriate CAVE installer on your local machine, if that is an option. If that is not an option, then the only remote access we recommend is using some type of VNC. RealVNC and nomachine are two options that are in use with positive outcomes. 
UltraVNC may be another option, but may have quite a delay. There may also be other free or paid software available that we are not aware of. It is likely that any VNC option you choose will also require some software or configuration to be set on the remote machine, and this will likely require administrative privileges. CAVE Spring Start Up Error \uf0c1 If you encounter the error below, please see one of our solution methods for resolving: CAVE's Spring container did not initialize correctly and CAVE must shut down. We have found the reason for this failure is because the host machine is set to use a language other than English (ie. Spanish, French, etc). To resolve this issue, either: Switch your system to English, when using CAVE or Use our Virtual Machine option . This option allows your actual machine to stay in whichever language you choose, while allowing you to run CAVE in an environment set to English. Although we list this installation under the Windows OS, this can also be done on Linux. The VM option has one notable drawback at the moment -- it cannot render RGB satellite products. Products Not Loading Properly \uf0c1 This problem is most commonly seen with the direct Windows installation. It can also manifest in the Mac installation (and is possible on Linux), and the root of the problem is not having Python installed properly for CAVE to use the packages. If the Windows installation was not completed properly, it is possible to see incorrect behavior when loading certain products. These are derived products which use the local machine to create and render the data. This creation is dependent upon python and its required packages working correctly. The dataset will be available in the menus and product browser, but when loaded, no data is drawn on the editor, but an entry is added to the legend. You may see an error that mentions the python package, jep . Known datasets this can affect (this is not a comprehensive list): Model Winds Metars Winds METAR Station Plot GFS Precip Type To correct this issue on Windows: Uninstall all related software (C++ Build Tools, Miniconda, Python, CAVE, pip, numpy, jep, etc) Redo all necessary installation instructions in steps 1 through 6 To correct this issue on Mac: Install the awips-python.pkg package found on step 1 To correct this issue on Linux: When running which python from a terminal, make sure /awips2/python/ is returned, if not, reset that environment variable, or re-run the awips_install.sh script from our installation instructions Windows \uf0c1 CAVE Map Display in Lower Left Quadrant - Windows \uf0c1 If you start up CAVE in Windows and notice the map is showing up only in the bottom left quadrant of your display, you will just need to tweak a few display settings. Try following these steps to fix your issue: Right-click on the CAVE.exe (or shortcut) icon, select Properties Select the Compatibility tab Click \"Change High DPI Settings\" At the bottom enable \"Override High DPI scaling behavior\" Change the dropdown from Application to System Windows CAVE Start Up Error \uf0c1 This should no longer be an issue for our v20 release of AWIPS. One common error some users are seeing manifests itself just after selecting an EDEX server to connect to. The following error dialogs may show up: Error purging logs Error instantiating workbench: null These errors are actually happening because the Windows machine is using IPv6, which is not compatible with AWIPS at this time. 
To fix the issue simply follow these steps: These screenshots may vary from your system. These instructions are per connection , so if you use multiple connections or switch between wired and wireless connections, you'll need to do the following for each of those connections so that CAVE will always run properly. 1. Close all error windows and any open windows associated with CAVE. 2. In the Windows search field, search for \"control panel\". 3. Once in the Control Panel, look for \"Network and Sharing Center\". 4. Select the adapter for your current connection (should be either \"Ethernet\" or \"Wi-Fi\"). 5. Click on \"Properties\". 6. Uncheck \"Internet Protocol Version 6 (TCP/IPv6)\" and select OK. You may need to restart your machine for this to take effect 7. Restart CAVE. MacOS \uf0c1 Monterey CAVE Warning \uf0c1 If you are running MacOS Monterey, you may see the following message when starting CAVE: Monterey versions 12.3 or newer will not support our production (v18) CAVE. Please download and install our beta v20 CAVE for newer MacOS Versions to avoid this issue. White Boxes for Surface Resources \uf0c1 If you do not have an NVIDIA graphics card and driver, you may see \"boxes\" drawn on the editor for some of the products ( METARS Station Plots and Surface Winds are the resources we're aware of), as shown below: You may be able to fix this issue: Check what graphics cards are available on your machine, by going to the Apple menu (far left, upper corner) > About This Mac > Overview tab (default): If you see two entries at the Graphics line, like the image shown above, then you have two graphics cards on your system. Intel graphics cards may be able to render our products properly. In this case, you can \"force\" your computer to use the Intel card by running the following in a terminal: sudo pmset -[a|b|c] gpuswitch 0 Where [a|b|c] is only one of those options, which mean: a: adjust settings for all scenarios b: adjust settings while running off battery c: adjust settings while connected to charger The argument 0 sets the computer to use the dedicated GPU (in our case above the Intel GPU). The two other options for that argument are: 1: automatic graphics switching 2: integrated GPU It may be smart to run pmset -g first, so you can see what the current gpuswitch setting is (likely 1 ), that way you can revert the settings if you want them back to how they were, when not using CAVE. Linux \uf0c1 Troubleshooting Uninstalling EDEX \uf0c1 Sometimes yum can get in a weird state and not know what AWIPS groups have been installed. 
For example if you are trying to remove AWIPS you may see an error: yum groupremove \"AWIPS EDEX Server\" Loaded plugins: fastestmirror, langpacks Loading mirror speeds from cached hostfile * base: mirror.dal.nexril.net * elrepo: ftp.osuosl.org * epel: mirrors.xmission.com * extras: mirrors.cat.pdx.edu * updates: mirror.mobap.edu No environment named AWIPS EDEX Server exists Maybe run: yum groups mark remove (see man yum) No packages to remove from groups To solve this issue, mark the group you want to remove and then try removing it again: yum groups mark remove \"AWIPS EDEX Server\" yum groupremove \"AWIPS EDEX Server\" Check to make sure your /etc/yum.repos.d/awips2.repo file has enabled=1 .","title":"Common Problems"},{"location":"appendix/common-problems/#common-problems","text":"","title":"Common Problems"},{"location":"appendix/common-problems/#all-operating-systems","text":"","title":"All Operating Systems"},{"location":"appendix/common-problems/#removing-cavedata","text":"Removing caveData (flushing the local cache) should be one of the first troubleshooting steps to take when experiencing weird behavior in CAVE. The cache lives in a folder called caveData , hence why this process is also referred to as removing or deleting caveData.","title":"Removing caveData"},{"location":"appendix/common-problems/#linux","text":"For Linux users, the easiest way is to open a new terminal and run the following command: rm -rf ~/caveData","title":"Linux"},{"location":"appendix/common-problems/#windows","text":"For Windows users, simply delete the caveData folder in your home user directory:","title":"Windows"},{"location":"appendix/common-problems/#mac","text":"For Mac users, the easiest way is to open a new terminal and run the following command: rm -rf ~/Library/caveData","title":"Mac"},{"location":"appendix/common-problems/#disappearing-configurations","text":"If you ever notice some of the following settings you've configured/saved disappear from CAVE: Saved Displays or Procedures NSHARP settings (line thickness, etc) Colormap settings StyleRule settings This is not a fully exhaustive list, so if something else has disappeared it might be the same underlying issue still. Then it is likely we have recently changed our production EDEX server. There is a good chance we can recover your settings. To do so, please send a short email to support-awips@unidata.ucar.edu with the topic \"Missing Configurations\", and include the username(s) of the computer(s) you use to run CAVE.","title":"Disappearing Configurations"},{"location":"appendix/common-problems/#remotely-connecting-to-cave","text":"Since the pandemic began, many users have asked if they can use X11 forwarding or ssh tunneling to remotely connect to CAVE machines. This is not recommended or supported , and CAVE crashes in many different ways and expresses strange behavior as well. We highly recommend you download the appropriate CAVE installer on your local machine, if that is an option. If that is not an option, then the only remote access we recommend is using some type of VNC. RealVNC and nomachine are two options that are in use with positive outcomes. UltraVNC may be another option, but may have quite a delay. There may also be other free or paid software available that we are not aware of. 
It is likely that any VNC option you choose will also require some software or configuration to be set on the remote machine, and this will likely require administrative privileges.","title":"Remotely Connecting to CAVE"},{"location":"appendix/common-problems/#cave-spring-start-up-error","text":"If you encounter the error below, please see one of our solution methods for resolving: CAVE's Spring container did not initialize correctly and CAVE must shut down. We have found the reason for this failure is because the host machine is set to use a language other than English (ie. Spanish, French, etc). To resolve this issue, either: Switch your system to English, when using CAVE or Use our Virtual Machine option . This option allows your actual machine to stay in whichever language you choose, while allowing you to run CAVE in an environment set to English. Although we list this installation under the Windows OS, this can also be done on Linux. The VM option has one notable drawback at the moment -- it cannot render RGB satellite products.","title":"CAVE Spring Start Up Error"},{"location":"appendix/common-problems/#products-not-loading-properly","text":"This problem is most commonly seen with the direct Windows installation. It can also manifest in the Mac installation (and is possible on Linux), and the root of the problem is not having Python installed properly for CAVE to use the packages. If the Windows installation was not completed properly, it is possible to see incorrect behavior when loading certain products. These are derived products which use the local machine to create and render the data. This creation is dependent upon python and its required packages working correctly. The dataset will be available in the menus and product browser, but when loaded, no data is drawn on the editor, but an entry is added to the legend. You may see an error that mentions the python package, jep . Known datasets this can affect (this is not a comprehensive list): Model Winds Metars Winds METAR Station Plot GFS Precip Type To correct this issue on Windows: Uninstall all related software (C++ Build Tools, Miniconda, Python, CAVE, pip, numpy, jep, etc) Redo all necessary installation instructions in steps 1 through 6 To correct this issue on Mac: Install the awips-python.pkg package found on step 1 To correct this issue on Linux: When running which python from a terminal, make sure /awips2/python/ is returned, if not, reset that environment variable, or re-run the awips_install.sh script from our installation instructions","title":"Products Not Loading Properly"},{"location":"appendix/common-problems/#windows_1","text":"","title":"Windows"},{"location":"appendix/common-problems/#cave-map-display-in-lower-left-quadrant-windows","text":"If you start up CAVE in Windows and notice the map is showing up only in the bottom left quadrant of your display, you will just need to tweak a few display settings. Try following these steps to fix your issue: Right-click on the CAVE.exe (or shortcut) icon, select Properties Select the Compatibility tab Click \"Change High DPI Settings\" At the bottom enable \"Override High DPI scaling behavior\" Change the dropdown from Application to System","title":"CAVE Map Display in Lower Left Quadrant - Windows"},{"location":"appendix/common-problems/#windows-cave-start-up-error","text":"This should no longer be an issue for our v20 release of AWIPS. One common error some users are seeing manifests itself just after selecting an EDEX server to connect to. 
The following error dialogs may show up: Error purging logs Error instantiating workbench: null These errors are actually happening because the Windows machine is using IPv6, which is not compatible with AWIPS at this time. To fix the issue simply follow these steps: These screenshots may vary from your system. These instructions are per connection , so if you use multiple connections or switch between wired and wireless connections, you'll need to do the following for each of those connections so that CAVE will always run properly. 1. Close all error windows and any open windows associated with CAVE. 2. In the Windows search field, search for \"control panel\". 3. Once in the Control Panel, look for \"Network and Sharing Center\". 4. Select the adapter for your current connection (should be either \"Ethernet\" or \"Wi-Fi\"). 5. Click on \"Properties\". 6. Uncheck \"Internet Protocol Version 6 (TCP/IPv6)\" and select OK. You may need to restart your machine for this to take effect 7. Restart CAVE.","title":"Windows CAVE Start Up Error"},{"location":"appendix/common-problems/#macos","text":"","title":"MacOS"},{"location":"appendix/common-problems/#monterey-cave-warning","text":"If you are running MacOS Monterey, you may see the following message when starting CAVE: Monterey versions 12.3 or newer will not support our production (v18) CAVE. Please download and install our beta v20 CAVE for newer MacOS Versions to avoid this issue.","title":"Monterey CAVE Warning"},{"location":"appendix/common-problems/#white-boxes-for-surface-resources","text":"If you do not have an NVIDIA graphics card and driver, you may see \"boxes\" drawn on the editor for some of the products ( METARS Station Plots and Surface Winds are the resources we're aware of), as shown below: You may be able to fix this issue: Check what graphics cards are available on your machine, by going to the Apple menu (far left, upper corner) > About This Mac > Overview tab (default): If you see two entries at the Graphics line, like the image shown above, then you have two graphics cards on your system. Intel graphics cards may be able to render our products properly. In this case, you can \"force\" your computer to use the Intel card by running the following in a terminal: sudo pmset -[a|b|c] gpuswitch 0 Where [a|b|c] is only one of those options, which mean: a: adjust settings for all scenarios b: adjust settings while running off battery c: adjust settings while connected to charger The argument 0 sets the computer to use the dedicated GPU (in our case above the Intel GPU). The two other options for that argument are: 1: automatic graphics switching 2: integrated GPU It may be smart to run pmset -g first, so you can see what the current gpuswitch setting is (likely 1 ), that way you can revert the settings if you want them back to how they were, when not using CAVE.","title":"White Boxes for Surface Resources"},{"location":"appendix/common-problems/#linux_1","text":"","title":"Linux"},{"location":"appendix/common-problems/#troubleshooting-uninstalling-edex","text":"Sometimes yum can get in a weird state and not know what AWIPS groups have been installed. 
For example if you are trying to remove AWIPS you may see an error: yum groupremove \"AWIPS EDEX Server\" Loaded plugins: fastestmirror, langpacks Loading mirror speeds from cached hostfile * base: mirror.dal.nexril.net * elrepo: ftp.osuosl.org * epel: mirrors.xmission.com * extras: mirrors.cat.pdx.edu * updates: mirror.mobap.edu No environment named AWIPS EDEX Server exists Maybe run: yum groups mark remove (see man yum) No packages to remove from groups To solve this issue, mark the group you want to remove and then try removing it again: yum groups mark remove \"AWIPS EDEX Server\" yum groupremove \"AWIPS EDEX Server\" Check to make sure your /etc/yum.repos.d/awips2.repo file has enabled=1 .","title":"Troubleshooting Uninstalling EDEX"},{"location":"appendix/educational-resources/","text":"Educational Resources \uf0c1 Here at Unidata, we want to provide as many resources as possible to make our tools and applications easy to use. For AWIPS we currently have a new eLearning course that is specific to CAVE. We also have a suite of Jupyter Notebooks that are meant to provide a detailed overview of many capabilities of python-awips. CAVE eLearning Course \uf0c1 Learn AWIPS CAVE is our online educational course for those interested in learning about CAVE. Access \uf0c1 Please create an account on Unidata eLearning , then self-enroll in Learn AWIPS CAVE . Content \uf0c1 Learn AWIPS CAVE is specifically tailored to content regarding CAVE -- the local graphical application used to view weather data. The following topics and capabilities are covered throughout the course: Launching CAVE Navigating the interface Modifying product appearances Understanding the time match basis Creating publication-quality graphics Exploring various CAVE layouts Saving and loading procedures and displays Using radar displays Using baselines and points Creating time series displays Creating vertical cross section displays Using the NSHARP editor for soundings Viewing model soundings Prerequisites \uf0c1 Required: A supported web browser CAVE version 18.2.1 installed on a supported operating system Recommended: A keyboard with a numpad and mouse with a scrollwheel Second monitor Design \uf0c1 Learn AWIPS CAVE is designed for those new to AWIPS or for those seeking to learn best practices. The course is organized into modular sections with supporting lessons, allowing for spaced learning or completion in multiple class or lab sessions. Each section concludes with a quiz to assess learning, and results can be requested by instructors or supervisors for their classes/teams. Below is a snapshot taken from the course. Lessons are tied to relevant learning objectives . Lessons are scaffolded such that each skill builds upon the next. Tutorials, challenges, and assessments are designed to support higher-order thinking skills and learning retention. Support \uf0c1 If you experience any technical issues with our online course, please contact us at: support-elearning@unidata.ucar.edu Python-AWIPS eLearning Course \uf0c1 Learn Python-AWIPS is our online educational course for those interested in learning about Python-AWIPS . Access \uf0c1 Please create an account on Unidata eLearning , then self-enroll in Learn Python-AWIPS . Content \uf0c1 Learn Python-AWIPS is designed for new users of Python-AWIPS who have some background in both Python and CAVE. Through tutorials, challenges, and demonstrations, you will learn the basics for working with EDEX resources through Python. 
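For readers who want a concrete picture of what "working with EDEX resources through Python" looks like before enrolling, below is a minimal python-awips sketch (a sketch only: it assumes the public edex-cloud.unidata.ucar.edu server is reachable, and the model, parameter, and level names are illustrative placeholders rather than course material).

```python
from awips.dataaccess import DataAccessLayer

# Point python-awips at an EDEX server.
# Assumption: the public edex-cloud.unidata.ucar.edu server is reachable.
DataAccessLayer.changeEDEXHost("edex-cloud.unidata.ucar.edu")

# 1. Explore what is available: list the grid models the server knows about.
request = DataAccessLayer.newDataRequest("grid")
print(sorted(DataAccessLayer.getAvailableLocationNames(request)))

# 2. Filter the request by model, parameter, and level.
#    Assumption: "RAP13", "T", and "2.0FHAG" (2 m temperature) exist on this server.
request.setLocationNames("RAP13")
request.setParameters("T")
request.setLevels("2.0FHAG")

# 3. Retrieve the most recent grid and inspect it (ready for plotting).
times = DataAccessLayer.getAvailableTimes(request)
grid = DataAccessLayer.getGridData(request, [times[-1]])[0]
print(grid.getDataTime(), grid.getLevel(), grid.getRawData().shape)
```

Each of the course topics listed next maps onto one of these steps: exploring the resources an EDEX server offers, building and filtering a request, and retrieving the data for manipulation and plotting.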
The following topics and capabilities are covered throughout the course: Programmatically explore the resources available on an EDEX server Make a request to an EDEX for data See examples of data manipulation Plot requested data Prerequisites \uf0c1 Required: A supported web browser Python3 Conda Git Python-AWIPS using the Source Code with Examples Install instructions Support \uf0c1 If you experience any technical issues with our online course, please contact us at: support-elearning@unidata.ucar.edu Python-AWIPS Example Notebooks \uf0c1 In addition to CAVE, AWIPS also has a Python package called python-awips which allows access to all data on an EDEX server. We have created a suite of Jupyter Notebooks as examples for how to use various functions of python-awips. Access \uf0c1 All of our Notebooks can be downloaded and accessed locally by following the source code installation instructions found on our python-awips website . Additionally, non-interactive webpage renderings of each of the Notebooks are also available for quick and easy references. Content \uf0c1 Our python-awips Notebooks span a wide range of topics, but generally cover the following: Investigating what data is available on an EDEX server Accessing and filtering desired data based on time and location Plotting and analyzing datasets Specific examples for various data types: satellite imagery, model data, soundings, surface obs, and more YouTube Channel and Playlist \uf0c1 Unidata has a YouTube channel where we publish videos about all of our software pacakges. Specifically we also have a playlist dedicated to AWIPS videos. Access \uf0c1 All Unidata vidoes can be accessed here on our channel. All AWIPS vidoes can be found on the AWIPS Playlist . Content \uf0c1 Our AWIPS videos cover a wide range of topics, but include some of the following themes: AWIPS topic overviews Instructional videos (ex. how to install CAVE) In-depth walkthroughs on CAVE functionality Python-AWIPS notebook examples AWIPS Tips Blog Series \uf0c1 AWIPS Tips is a bi-weekly (every two weeks) blog series that is posted on our Unidata blogs page. Entries in the series cover topics relating to CAVE, python-awips, EDEX, and more. Access \uf0c1 View all of the AWIPS Tips blogs here , and easily search for them using the awips-tips tag. Please join our mailing list (awips2-users) to get the notifications of new AWIPS Tips when they come out! Content \uf0c1 A full list of all released blogs can be found below: General \uf0c1 Welcome to AWIPS Tips! AWIPS 18.2.1 Software Release Announcing AWIPS eLearning AWIPS 18.2.1-3 Software Release Access Learn AWIPS CAVE from Unidata eLearning AWIPS 18.2.1-5 Software Release GLM DATA IDD/LDM Feed Updates AWIPS 18.2.1-6 Software Release Unidata AWIPS Summer Internship 2022: Rhoen Fiutak Announcing a New eLearning Course: Learn Python-AWIPS Use Case Example: Texas A&M CAVE in the Classroom AWIPS 20.3.2-0.1 Beta CAVE Software Release AWIPS 20.3.2-0.2 Beta CAVE Software Release AWIPS 20.3.2-0.3 Beta CAVE Software Release AWIPS 20.3.2-0.4 Beta Software Release - with EDEX! 
CAVE \uf0c1 Visualizing Data in CAVE Display Capabilities in CAVE Time Tips Explore the CAVE Product Browser CAVE's Local Cache: caveData Explore the CAVE Volume Browser: Plan Views Using CAVE's Points and Baselines Tool Explore the CAVE Volume Browser: Cross Section and Time Series Using CAVE Displays and Procedures Getting Started With the NSHARP Display Tool Explore the CAVE Volume Browser: Model Soundings NUCAPS Soundings Import Shapefiles in CAVE Create Objective Analysis Plots Use Warngen to Draw Convective Warnings Using Drawing Properties for WWA Display in CAVE Understanding Graphic vs Image Products in CAVE Getting to Know CAVE's Display Properties Creating a User Override Frames in CAVE Panes in CAVE Image Combination with CAVE Colorized GOES CIRA Products Changing Localizations in CAVE All About Sampling Maps Database Constraints Python-AWIPS \uf0c1 Access Model Output with Python-AWIPS Plot New GOES Products From Unidata's Public EDEX Load Map Resources and Topography using Python-AWIPS Create a Colored Surface Temperature Plot Create Colorized Model Plots View WWA Polygons with Python-AWIPS Creating METAR Station Plots Create Sounding Plots with Model Data Plotting Multiple Datasets from EDEX Open Jupyter Notebooks with our Virtual Machine Visualizing Upper Air Soundings Compare Model Sounding Data in Python Beta Python-AWIPS Release EDEX \uf0c1 Get to Know EDEX EDEX Data Retention Adding ECMWF Data to EDEX Ingesting GOES Satellite Data Localization Levels in EDEX Porting Users CAVE Configurations Creating New Scales/Maps Adding Shapefiles to the Maps Menu with EDEX Removing Model Data from EDEX LDM Usage in AWIPS","title":"Educational Resources"},{"location":"appendix/educational-resources/#educational-resources","text":"Here at Unidata, we want to provide as many resources as possible to make our tools and applications easy to use. For AWIPS we currently have a new eLearning course that is specific to CAVE. We also have a suite of Jupyter Notebooks that are meant to provide a detailed overview of many capabilities of python-awips.","title":"Educational Resources"},{"location":"appendix/educational-resources/#cave-elearning-course","text":"Learn AWIPS CAVE is our online educational course for those interested in learning about CAVE.","title":"CAVE eLearning Course"},{"location":"appendix/educational-resources/#access","text":"Please create an account on Unidata eLearning , then self-enroll in Learn AWIPS CAVE .","title":"Access"},{"location":"appendix/educational-resources/#content","text":"Learn AWIPS CAVE is specifically tailored to content regarding CAVE -- the local graphical application used to view weather data. 
The following topics and capabilities are covered throughout the course: Launching CAVE Navigating the interface Modifying product appearances Understanding the time match basis Creating publication-quality graphics Exploring various CAVE layouts Saving and loading procedures and displays Using radar displays Using baselines and points Creating time series displays Creating vertical cross section displays Using the NSHARP editor for soundings Viewing model soundings","title":"Content"},{"location":"appendix/educational-resources/#prerequisites","text":"Required: A supported web browser CAVE version 18.2.1 installed on a supported operating system Recommended: A keyboard with a numpad and mouse with a scrollwheel Second monitor","title":"Prerequisites"},{"location":"appendix/educational-resources/#design","text":"Learn AWIPS CAVE is designed for those new to AWIPS or for those seeking to learn best practices. The course is organized into modular sections with supporting lessons, allowing for spaced learning or completion in multiple class or lab sessions. Each section concludes with a quiz to assess learning, and results can be requested by instructors or supervisors for their classes/teams. Below is a snapshot taken from the course. Lessons are tied to relevant learning objectives . Lessons are scaffolded such that each skill builds upon the next. Tutorials, challenges, and assessments are designed to support higher-order thinking skills and learning retention.","title":"Design"},{"location":"appendix/educational-resources/#support","text":"If you experience any technical issues with our online course, please contact us at: support-elearning@unidata.ucar.edu","title":"Support"},{"location":"appendix/educational-resources/#python-awips-elearning-course","text":"Learn Python-AWIPS is our online educational course for those interested in learning about Python-AWIPS .","title":"Python-AWIPS eLearning Course"},{"location":"appendix/educational-resources/#access_1","text":"Please create an account on Unidata eLearning , then self-enroll in Learn Python-AWIPS .","title":"Access"},{"location":"appendix/educational-resources/#content_1","text":"Learn Python-AWIPS is designed for new users of Python-AWIPS who have some background in both Python and CAVE. Through tutorials, challenges, and demonstrations, you will learn the basics for working with EDEX resources through Python. The following topics and capabilities are covered throughout the course: Programmatically explore the resources available on an EDEX server Make a request to an EDEX for data See examples of data manipulation Plot requested data","title":"Content"},{"location":"appendix/educational-resources/#prerequisites_1","text":"Required: A supported web browser Python3 Conda Git Python-AWIPS using the Source Code with Examples Install instructions","title":"Prerequisites"},{"location":"appendix/educational-resources/#support_1","text":"If you experience any technical issues with our online course, please contact us at: support-elearning@unidata.ucar.edu","title":"Support"},{"location":"appendix/educational-resources/#python-awips-example-notebooks","text":"In addition to CAVE, AWIPS also has a Python package called python-awips which allows access to all data on an EDEX server. 
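As an illustrative sketch of that access (assumptions: python-awips is installed, the hostname is a placeholder, and the model name, parameter, and level shown must actually exist on the server being queried), the snippet below pulls a single grid over the Data Access Framework:

```python
from awips.dataaccess import DataAccessLayer

# Connect to an EDEX server (placeholder hostname).
DataAccessLayer.changeEDEXHost("edex-hostname")

# Build a grid request; "GFS", "T", and "2.0FHAG" are example values and
# must match products available on the server you are talking to.
request = DataAccessLayer.newDataRequest("grid")
request.setLocationNames("GFS")
request.setParameters("T")
request.setLevels("2.0FHAG")

# Ask for the available times (assumed non-empty here) and fetch the
# most recent grid.
times = DataAccessLayer.getAvailableTimes(request)
grids = DataAccessLayer.getGridData(request, [times[-1]])
grid = grids[0]

lons, lats = grid.getLatLonCoords()
print(grid.getParameter(), grid.getUnit(), grid.getRawData().shape)
```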
We have created a suite of Jupyter Notebooks as examples for how to use various functions of python-awips.","title":"Python-AWIPS Example Notebooks"},{"location":"appendix/educational-resources/#access_2","text":"All of our Notebooks can be downloaded and accessed locally by following the source code installation instructions found on our python-awips website . Additionally, non-interactive webpage renderings of each of the Notebooks are also available for quick and easy references.","title":"Access"},{"location":"appendix/educational-resources/#content_2","text":"Our python-awips Notebooks span a wide range of topics, but generally cover the following: Investigating what data is available on an EDEX server Accessing and filtering desired data based on time and location Plotting and analyzing datasets Specific examples for various data types: satellite imagery, model data, soundings, surface obs, and more","title":"Content"},{"location":"appendix/educational-resources/#youtube-channel-and-playlist","text":"Unidata has a YouTube channel where we publish videos about all of our software packages. Specifically, we also have a playlist dedicated to AWIPS videos.","title":"YouTube Channel and Playlist"},{"location":"appendix/educational-resources/#access_3","text":"All Unidata videos can be accessed here on our channel. All AWIPS videos can be found on the AWIPS Playlist .","title":"Access"},{"location":"appendix/educational-resources/#content_3","text":"Our AWIPS videos cover a wide range of topics, but include some of the following themes: AWIPS topic overviews Instructional videos (ex. how to install CAVE) In-depth walkthroughs on CAVE functionality Python-AWIPS notebook examples","title":"Content"},{"location":"appendix/educational-resources/#awips-tips-blog-series","text":"AWIPS Tips is a bi-weekly (every two weeks) blog series that is posted on our Unidata blogs page. Entries in the series cover topics relating to CAVE, python-awips, EDEX, and more.","title":"AWIPS Tips Blog Series"},{"location":"appendix/educational-resources/#access_4","text":"View all of the AWIPS Tips blogs here , and easily search for them using the awips-tips tag. Please join our mailing list (awips2-users) to get notifications of new AWIPS Tips when they come out!","title":"Access"},{"location":"appendix/educational-resources/#content_4","text":"A full list of all released blogs can be found below:","title":"Content"},{"location":"appendix/educational-resources/#general","text":"Welcome to AWIPS Tips!
AWIPS 18.2.1 Software Release Announcing AWIPS eLearning AWIPS 18.2.1-3 Software Release Access Learn AWIPS CAVE from Unidata eLearning AWIPS 18.2.1-5 Software Release GLM DATA IDD/LDM Feed Updates AWIPS 18.2.1-6 Software Release Unidata AWIPS Summer Internship 2022: Rhoen Fiutak Announcing a New eLearning Course: Learn Python-AWIPS Use Case Example: Texas A&M CAVE in the Classroom AWIPS 20.3.2-0.1 Beta CAVE Software Release AWIPS 20.3.2-0.2 Beta CAVE Software Release AWIPS 20.3.2-0.3 Beta CAVE Software Release AWIPS 20.3.2-0.4 Beta Software Release - with EDEX!","title":"General"},{"location":"appendix/educational-resources/#cave","text":"Visualizing Data in CAVE Display Capabilities in CAVE Time Tips Explore the CAVE Product Browser CAVE's Local Cache: caveData Explore the CAVE Volume Browser: Plan Views Using CAVE's Points and Baselines Tool Explore the CAVE Volume Browser: Cross Section and Time Series Using CAVE Displays and Procedures Getting Started With the NSHARP Display Tool Explore the CAVE Volume Browser: Model Soundings NUCAPS Soundings Import Shapefiles in CAVE Create Objective Analysis Plots Use Warngen to Draw Convective Warnings Using Drawing Properties for WWA Display in CAVE Understanding Graphic vs Image Products in CAVE Getting to Know CAVE's Display Properties Creating a User Override Frames in CAVE Panes in CAVE Image Combination with CAVE Colorized GOES CIRA Products Changing Localizations in CAVE All About Sampling Maps Database Constraints","title":"CAVE"},{"location":"appendix/educational-resources/#python-awips","text":"Access Model Output with Python-AWIPS Plot New GOES Products From Unidata's Public EDEX Load Map Resources and Topography using Python-AWIPS Create a Colored Surface Temperature Plot Create Colorized Model Plots View WWA Polygons with Python-AWIPS Creating METAR Station Plots Create Sounding Plots with Model Data Plotting Multiple Datasets from EDEX Open Jupyter Notebooks with our Virtual Machine Visualizing Upper Air Soundings Compare Model Sounding Data in Python Beta Python-AWIPS Release","title":"Python-AWIPS"},{"location":"appendix/educational-resources/#edex","text":"Get to Know EDEX EDEX Data Retention Adding ECMWF Data to EDEX Ingesting GOES Satellite Data Localization Levels in EDEX Porting Users CAVE Configurations Creating New Scales/Maps Adding Shapefiles to the Maps Menu with EDEX Removing Model Data from EDEX LDM Usage in AWIPS","title":"EDEX"},{"location":"appendix/maps-database/","text":"mapdata.airport \uf0c1 Column Type arpt_id character varying(4) name character varying(42) city character varying(40) state character varying(2) siteno character varying(9) site_type character varying(1) fac_use character varying(2) owner_type character varying(2) elv integer latitude character varying(16) longitude character varying(16) lon double precision lat double precision the_geom geometry(Point,4326) ok mapdata.allrivers \uf0c1 Column Type ihabbsrf_i double precision rr character varying(11) huc integer type character varying(1) pmile double precision pname character varying(30) owname character varying(30) pnmcd character varying(11) ownmcd character varying(11) dsrr double precision dshuc integer usdir character varying(1) lev smallint j smallint termid integer trmblv smallint k smallint the_geom geometry(MultiLineString,4326) mapdata.artcc \uf0c1 Column Type artcc character varying(4) alt character varying(1) name character varying(30) type character varying(5) city character varying(40) id double precision the_geom 
geometry(MultiPolygon,4326) mapdata.basins \uf0c1 Column Type rfc character varying(7) cwa character varying(5) id character varying(8) name character varying(64) lon double precision lat double precision the_geom geometry(MultiPolygon,4326) the_geom_0 geometry(MultiPolygon,4326) the_geom_0_064 geometry(MultiPolygon,4326) the_geom_0_016 geometry(MultiPolygon,4326) the_geom_0_004 geometry(MultiPolygon,4326) the_geom_0_001 geometry(MultiPolygon,4326) mapdata.canada \uf0c1 Column Type f_code character varying(5) name_en character varying(25) nom_fr character varying(25) country character varying(3) cgns_fid character varying(32) the_geom geometry(MultiPolygon,4326) mapdata.city \uf0c1 Column Type st_fips character varying(4) sfips character varying(2) county_fip character varying(4) cfips character varying(4) pl_fips character varying(7) id character varying(20) name character varying(39) elevation character varying(60) pop_1990 numeric population character varying(30) st character varying(6) warngenlev character varying(16) warngentyp character varying(16) watch_warn character varying(3) zwatch_war double precision prog_disc integer zprog_disc double precision comboflag double precision land_water character varying(16) recnum double precision lon double precision lat double precision f3 double precision f4 character varying(254) f6 double precision state character varying(25) the_geom geometry(Point,4326) mapdata.county \uf0c1 Column Type state character varying(2) cwa character varying(9) countyname character varying(24) fips character varying(5) time_zone character varying(2) fe_area character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326) mapdata.customlocations \uf0c1 Column Type bullet character varying(16) name character varying(64) cwa character varying(12) rfc character varying(8) lon numeric lat numeric the_geom geometry(MultiPolygon,4326) mapdata.cwa \uf0c1 Column Type cwa character varying(9) wfo character varying(3) lon numeric lat numeric region character varying(2) fullstaid character varying(4) citystate character varying(50) city character varying(50) state character varying(50) st character varying(2) the_geom geometry(MultiPolygon,4326) mapdata.firewxaor \uf0c1 Column Type cwa character varying(3) wfo character varying(3) the_geom geometry(MultiPolygon,4326) mapdata.firewxzones \uf0c1 Column Type state character varying(2) zone character varying(3) cwa character varying(3) name character varying(254) state_zone character varying(5) time_zone character varying(2) fe_area character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326) mapdata.fix \uf0c1 Column Type id character varying(30) type character varying(2) use character varying(5) state character varying(2) min_alt integer latitude character varying(16) longitude character varying(16) lon double precision lat double precision the_geom geometry(Point,4326) mapdata.highaltitude \uf0c1 Column Type awy_des character varying(2) awy_id character varying(12) awy_type character varying(1) airway character varying(16) newfield1 double precision the_geom geometry(MultiLineString,4326) mapdata.highsea \uf0c1 Column Type wfo character varying(3) name character varying(250) lat numeric lon numeric id character varying(5) the_geom geometry(MultiPolygon,4326) mapdata.highway \uf0c1 Column Type prefix character varying(2) pretype character varying(6) name character varying(30) type character varying(6) suffix character varying(2) class character varying(1) class_rte character varying(1) 
hwy_type character varying(1) hwy_symbol character varying(20) route character varying(25) the_geom geometry(MultiLineString,4326) mapdata.hsa \uf0c1 Column Type wfo character varying(3) lon double precision lat double precision the_geom geometry(MultiPolygon,4326) mapdata.interstate \uf0c1 Column Type prefix character varying(2) pretype Ushy Hwy Ave Cord Rt Loop I Sthy name character varying(30) type character varying(6) suffix character varying(2) hwy_type I U S hwy_symbol character varying(20) route character varying(25) the_geom geometry(MultiLineString,4326) mapdata.isc \uf0c1 Column Type wfo character varying(3) cwa character varying(3) the_geom geometry(MultiPolygon,4326) mapdata.lake \uf0c1 Column Type name character varying(40) feature character varying(40) lon double precision lat double precision the_geom geometry(MultiPolygon,4326) mapdata.latlon10 \uf0c1 Column Type the_geom geometry(MultiLineString,4326) mapdata.lowaltitude \uf0c1 Column Type awy_des character varying(2) awy_id character varying(12) awy_type character varying(1) airway character varying(16) newfield1 double precision the_geom geometry(MultiLineString,4326) mapdata.majorrivers \uf0c1 Column Type rf1_150_id double precision huc integer seg smallint milept double precision seqno double precision rflag character varying(1) owflag character varying(1) tflag character varying(1) sflag character varying(1) type character varying(1) segl double precision lev smallint j smallint k smallint pmile double precision arbsum double precision usdir character varying(1) termid integer trmblv smallint pname character varying(30) pnmcd character varying(11) owname character varying(30) ownmcd character varying(11) dshuc integer dsseg smallint dsmlpt double precision editrf1_ double precision demand double precision ftimped double precision tfimped double precision dir double precision rescode double precision center double precision erf1__ double precision reservoir_ double precision pname_res character varying(30) pnmcd_res character varying(11) meanq double precision lowq double precision meanv double precision lowv double precision worka double precision gagecode double precision strahler double precision rr character varying(11) dsrr double precision huc2 smallint huc4 smallint huc6 integer the_geom geometry(MultiLineString,4326) mapdata.marinesites \uf0c1 Column Type st character varying(3) name character varying(50) prog_disc bigint warngenlev character varying(14) the_geom geometry(Point,4326) mapdata.marinezones \uf0c1 Column Type id character varying(6) wfo character varying(3) gl_wfo character varying(3) name character varying(254) ajoin0 character varying(6) ajoin1 character varying(6) lon numeric lat numeric the_geom geometry(MultiPolygon,4326) mapdata.mexico \uf0c1 Column Type area double precision perimeter double precision st_mx_ double precision st_mx_id double precision name character varying(66) country character varying(127) continent character varying(127) the_geom geometry(MultiPolygon,4326) mapdata.navaid \uf0c1 Column Type id character varying(30) clscode character varying(11) city character varying(40) elv integer freq double precision name character varying(30) status character varying(30) type character varying(25) oprhours character varying(11) oprname character varying(50) latdms character varying(16) londms character varying(16) airway character varying(254) sym smallint lon double precision lat double precision the_geom geometry(Point,4326) mapdata.offshore \uf0c1 Column Type id character 
varying(50) wfo character varying(10) lon numeric lat numeric location character varying(70) name character varying(90) the_geom geometry(MultiPolygon,4326) mapdata.railroad \uf0c1 Column Type fnode_ double precision tnode_ double precision lpoly_ double precision rpoly_ double precision length numeric railrdl021 double precision railrdl020 double precision feature character varying(18) name character varying(43) state character varying(2) state_fips character varying(2) the_geom geometry(MultiLineString,4326) mapdata.rfc \uf0c1 Column Type site_id character varying(3) state character varying(2) rfc_name character varying(18) rfc_city character varying(25) basin_id character varying(5) the_geom geometry(MultiPolygon,4326) mapdata.specialuse \uf0c1 Column Type name character varying(32) code character varying(16) yn smallint alt_desc character varying(128) artcc character varying(4) ctr_agen character varying(128) sch_agen character varying(128) state character varying(2) the_geom geometry(MultiPolygon,4326) mapdata.states \uf0c1 Column Type state character varying(2) name character varying(24) fips character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326) mapdata.timezones \uf0c1 Column Type name character varying(50) time_zone character varying(1) standard character varying(9) advanced character varying(10) unix_time character varying(19) lon double precision lat double precision the_geom geometry(MultiPolygon,4326) mapdata.warngenloc \uf0c1 Column Type name character varying(254) st character varying(3) state character varying(20) population integer warngenlev integer cwa character varying(4) goodness double precision lat numeric lon numeric usedirs numeric(10,0) supdirs character varying(20) landwater character varying(3) recnum integer the_geom geometry(MultiPolygon,4326) mapdata.world \uf0c1 Column Type name character varying(30) count double precision first_coun character varying(2) first_regi character varying(1) the_geom geometry(MultiPolygon,4326) mapdata.zone \uf0c1 Column Type state character varying(2) cwa character varying(9) time_zone character varying(2) fe_area character varying(2) zone character varying(3) name character varying(254) state_zone character varying(5) lon numeric lat numeric shortname character varying(32) the_geom geometry(MultiPolygon,4326)","title":"Maps database"},{"location":"appendix/maps-database/#mapdataairport","text":"Column Type arpt_id character varying(4) name character varying(42) city character varying(40) state character varying(2) siteno character varying(9) site_type character varying(1) fac_use character varying(2) owner_type character varying(2) elv integer latitude character varying(16) longitude character varying(16) lon double precision lat double precision the_geom geometry(Point,4326) ok","title":"mapdata.airport"},{"location":"appendix/maps-database/#mapdataallrivers","text":"Column Type ihabbsrf_i double precision rr character varying(11) huc integer type character varying(1) pmile double precision pname character varying(30) owname character varying(30) pnmcd character varying(11) ownmcd character varying(11) dsrr double precision dshuc integer usdir character varying(1) lev smallint j smallint termid integer trmblv smallint k smallint the_geom geometry(MultiLineString,4326)","title":"mapdata.allrivers"},{"location":"appendix/maps-database/#mapdataartcc","text":"Column Type artcc character varying(4) alt character varying(1) name character varying(30) type character varying(5) city character varying(40) id 
double precision the_geom geometry(MultiPolygon,4326)","title":"mapdata.artcc"},{"location":"appendix/maps-database/#mapdatabasins","text":"Column Type rfc character varying(7) cwa character varying(5) id character varying(8) name character varying(64) lon double precision lat double precision the_geom geometry(MultiPolygon,4326) the_geom_0 geometry(MultiPolygon,4326) the_geom_0_064 geometry(MultiPolygon,4326) the_geom_0_016 geometry(MultiPolygon,4326) the_geom_0_004 geometry(MultiPolygon,4326) the_geom_0_001 geometry(MultiPolygon,4326)","title":"mapdata.basins"},{"location":"appendix/maps-database/#mapdatacanada","text":"Column Type f_code character varying(5) name_en character varying(25) nom_fr character varying(25) country character varying(3) cgns_fid character varying(32) the_geom geometry(MultiPolygon,4326)","title":"mapdata.canada"},{"location":"appendix/maps-database/#mapdatacity","text":"Column Type st_fips character varying(4) sfips character varying(2) county_fip character varying(4) cfips character varying(4) pl_fips character varying(7) id character varying(20) name character varying(39) elevation character varying(60) pop_1990 numeric population character varying(30) st character varying(6) warngenlev character varying(16) warngentyp character varying(16) watch_warn character varying(3) zwatch_war double precision prog_disc integer zprog_disc double precision comboflag double precision land_water character varying(16) recnum double precision lon double precision lat double precision f3 double precision f4 character varying(254) f6 double precision state character varying(25) the_geom geometry(Point,4326)","title":"mapdata.city"},{"location":"appendix/maps-database/#mapdatacounty","text":"Column Type state character varying(2) cwa character varying(9) countyname character varying(24) fips character varying(5) time_zone character varying(2) fe_area character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)","title":"mapdata.county"},{"location":"appendix/maps-database/#mapdatacustomlocations","text":"Column Type bullet character varying(16) name character varying(64) cwa character varying(12) rfc character varying(8) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)","title":"mapdata.customlocations"},{"location":"appendix/maps-database/#mapdatacwa","text":"Column Type cwa character varying(9) wfo character varying(3) lon numeric lat numeric region character varying(2) fullstaid character varying(4) citystate character varying(50) city character varying(50) state character varying(50) st character varying(2) the_geom geometry(MultiPolygon,4326)","title":"mapdata.cwa"},{"location":"appendix/maps-database/#mapdatafirewxaor","text":"Column Type cwa character varying(3) wfo character varying(3) the_geom geometry(MultiPolygon,4326)","title":"mapdata.firewxaor"},{"location":"appendix/maps-database/#mapdatafirewxzones","text":"Column Type state character varying(2) zone character varying(3) cwa character varying(3) name character varying(254) state_zone character varying(5) time_zone character varying(2) fe_area character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)","title":"mapdata.firewxzones"},{"location":"appendix/maps-database/#mapdatafix","text":"Column Type id character varying(30) type character varying(2) use character varying(5) state character varying(2) min_alt integer latitude character varying(16) longitude character varying(16) lon double precision lat double precision the_geom 
geometry(Point,4326)","title":"mapdata.fix"},{"location":"appendix/maps-database/#mapdatahighaltitude","text":"Column Type awy_des character varying(2) awy_id character varying(12) awy_type character varying(1) airway character varying(16) newfield1 double precision the_geom geometry(MultiLineString,4326)","title":"mapdata.highaltitude"},{"location":"appendix/maps-database/#mapdatahighsea","text":"Column Type wfo character varying(3) name character varying(250) lat numeric lon numeric id character varying(5) the_geom geometry(MultiPolygon,4326)","title":"mapdata.highsea"},{"location":"appendix/maps-database/#mapdatahighway","text":"Column Type prefix character varying(2) pretype character varying(6) name character varying(30) type character varying(6) suffix character varying(2) class character varying(1) class_rte character varying(1) hwy_type character varying(1) hwy_symbol character varying(20) route character varying(25) the_geom geometry(MultiLineString,4326)","title":"mapdata.highway"},{"location":"appendix/maps-database/#mapdatahsa","text":"Column Type wfo character varying(3) lon double precision lat double precision the_geom geometry(MultiPolygon,4326)","title":"mapdata.hsa"},{"location":"appendix/maps-database/#mapdatainterstate","text":"Column Type prefix character varying(2) pretype Ushy Hwy Ave Cord Rt Loop I Sthy name character varying(30) type character varying(6) suffix character varying(2) hwy_type I U S hwy_symbol character varying(20) route character varying(25) the_geom geometry(MultiLineString,4326)","title":"mapdata.interstate"},{"location":"appendix/maps-database/#mapdataisc","text":"Column Type wfo character varying(3) cwa character varying(3) the_geom geometry(MultiPolygon,4326)","title":"mapdata.isc"},{"location":"appendix/maps-database/#mapdatalake","text":"Column Type name character varying(40) feature character varying(40) lon double precision lat double precision the_geom geometry(MultiPolygon,4326)","title":"mapdata.lake"},{"location":"appendix/maps-database/#mapdatalatlon10","text":"Column Type the_geom geometry(MultiLineString,4326)","title":"mapdata.latlon10"},{"location":"appendix/maps-database/#mapdatalowaltitude","text":"Column Type awy_des character varying(2) awy_id character varying(12) awy_type character varying(1) airway character varying(16) newfield1 double precision the_geom geometry(MultiLineString,4326)","title":"mapdata.lowaltitude"},{"location":"appendix/maps-database/#mapdatamajorrivers","text":"Column Type rf1_150_id double precision huc integer seg smallint milept double precision seqno double precision rflag character varying(1) owflag character varying(1) tflag character varying(1) sflag character varying(1) type character varying(1) segl double precision lev smallint j smallint k smallint pmile double precision arbsum double precision usdir character varying(1) termid integer trmblv smallint pname character varying(30) pnmcd character varying(11) owname character varying(30) ownmcd character varying(11) dshuc integer dsseg smallint dsmlpt double precision editrf1_ double precision demand double precision ftimped double precision tfimped double precision dir double precision rescode double precision center double precision erf1__ double precision reservoir_ double precision pname_res character varying(30) pnmcd_res character varying(11) meanq double precision lowq double precision meanv double precision lowv double precision worka double precision gagecode double precision strahler double precision rr character varying(11) dsrr double 
precision huc2 smallint huc4 smallint huc6 integer the_geom geometry(MultiLineString,4326)","title":"mapdata.majorrivers"},{"location":"appendix/maps-database/#mapdatamarinesites","text":"Column Type st character varying(3) name character varying(50) prog_disc bigint warngenlev character varying(14) the_geom geometry(Point,4326)","title":"mapdata.marinesites"},{"location":"appendix/maps-database/#mapdatamarinezones","text":"Column Type id character varying(6) wfo character varying(3) gl_wfo character varying(3) name character varying(254) ajoin0 character varying(6) ajoin1 character varying(6) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)","title":"mapdata.marinezones"},{"location":"appendix/maps-database/#mapdatamexico","text":"Column Type area double precision perimeter double precision st_mx_ double precision st_mx_id double precision name character varying(66) country character varying(127) continent character varying(127) the_geom geometry(MultiPolygon,4326)","title":"mapdata.mexico"},{"location":"appendix/maps-database/#mapdatanavaid","text":"Column Type id character varying(30) clscode character varying(11) city character varying(40) elv integer freq double precision name character varying(30) status character varying(30) type character varying(25) oprhours character varying(11) oprname character varying(50) latdms character varying(16) londms character varying(16) airway character varying(254) sym smallint lon double precision lat double precision the_geom geometry(Point,4326)","title":"mapdata.navaid"},{"location":"appendix/maps-database/#mapdataoffshore","text":"Column Type id character varying(50) wfo character varying(10) lon numeric lat numeric location character varying(70) name character varying(90) the_geom geometry(MultiPolygon,4326)","title":"mapdata.offshore"},{"location":"appendix/maps-database/#mapdatarailroad","text":"Column Type fnode_ double precision tnode_ double precision lpoly_ double precision rpoly_ double precision length numeric railrdl021 double precision railrdl020 double precision feature character varying(18) name character varying(43) state character varying(2) state_fips character varying(2) the_geom geometry(MultiLineString,4326)","title":"mapdata.railroad"},{"location":"appendix/maps-database/#mapdatarfc","text":"Column Type site_id character varying(3) state character varying(2) rfc_name character varying(18) rfc_city character varying(25) basin_id character varying(5) the_geom geometry(MultiPolygon,4326)","title":"mapdata.rfc"},{"location":"appendix/maps-database/#mapdataspecialuse","text":"Column Type name character varying(32) code character varying(16) yn smallint alt_desc character varying(128) artcc character varying(4) ctr_agen character varying(128) sch_agen character varying(128) state character varying(2) the_geom geometry(MultiPolygon,4326)","title":"mapdata.specialuse"},{"location":"appendix/maps-database/#mapdatastates","text":"Column Type state character varying(2) name character varying(24) fips character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)","title":"mapdata.states"},{"location":"appendix/maps-database/#mapdatatimezones","text":"Column Type name character varying(50) time_zone character varying(1) standard character varying(9) advanced character varying(10) unix_time character varying(19) lon double precision lat double precision the_geom geometry(MultiPolygon,4326)","title":"mapdata.timezones"},{"location":"appendix/maps-database/#mapdatawarngenloc","text":"Column Type name character 
varying(254) st character varying(3) state character varying(20) population integer warngenlev integer cwa character varying(4) goodness double precision lat numeric lon numeric usedirs numeric(10,0) supdirs character varying(20) landwater character varying(3) recnum integer the_geom geometry(MultiPolygon,4326)","title":"mapdata.warngenloc"},{"location":"appendix/maps-database/#mapdataworld","text":"Column Type name character varying(30) count double precision first_coun character varying(2) first_regi character varying(1) the_geom geometry(MultiPolygon,4326)","title":"mapdata.world"},{"location":"appendix/maps-database/#mapdatazone","text":"Column Type state character varying(2) cwa character varying(9) time_zone character varying(2) fe_area character varying(2) zone character varying(3) name character varying(254) state_zone character varying(5) lon numeric lat numeric shortname character varying(32) the_geom geometry(MultiPolygon,4326)","title":"mapdata.zone"},{"location":"cave/bundles-and-procedures/","text":"Displays and Procedures \uf0c1 AWIPS contains two methods for saving and loading data resources: Displays are an all-encompasing way to save loaded resources and current view configurations either onto the connected EDEX server, or a local file for access in future CAVE sessions. Procedures are similar to Displays, but can be thought of as groups of procedure items which allow the user to save/load only parts of the procedure they desire, and allows the user to manage saved resources with more control. Displays \uf0c1 File > Load Display \uf0c1 Load a previously-saved display from within the AWIPS system. The pop-up dialog allows you to select your own saved displays as well as those saved by other users. When loading a display, all existing tabs will be closed and replaced with the contents from the saved display. Displays will load as many Map Editor tabs as existed when the display was originally saved. Load Display from Local Disk \uf0c1 To load a previously-saved display from a path within the file directory locally, select File > Load Display and then select the File button on the right to browse your local directories. File > Save Display \uf0c1 Save a product display within the AWIPS system. This saves the display to the EDEX server for your specific user. File > Save Display Locally \uf0c1 To save a product display to a path within the file directory locally, select File > Save Display Locally and then select the File button on the right. File > Delete Displays \uf0c1 Select and remove a saved display under File > Delete Displays , this will open a pop-up dialog. Select the file name and click OK and then confirm deletion to remove the saved file permanently. Procedures \uf0c1 New Procedure \uf0c1 Select the menu File > Procedures > New... Select Copy Into to add all loaded resources from your current map to the Procedure Stack Select Save (or Save As ) and then enter a name for the Procedure before clicking OK to save. Open Procedure \uf0c1 Similar to creating a new Procedure, select File > Procedures > Open... , select the saved resources and click Load to load them to the current Map Editor tab. If multiple procedure items are wanted for loading, create a new tab for each procedure item and then load that item into the tab. This process is shown in the video below. Delete Procedure \uf0c1 From the menu File > Procedures > Delete... 
you can delete existing Procedure files in a way similar to deleting saved display files.","title":"Displays and Procedures"},{"location":"cave/bundles-and-procedures/#displays-and-procedures","text":"AWIPS contains two methods for saving and loading data resources: Displays are an all-encompasing way to save loaded resources and current view configurations either onto the connected EDEX server, or a local file for access in future CAVE sessions. Procedures are similar to Displays, but can be thought of as groups of procedure items which allow the user to save/load only parts of the procedure they desire, and allows the user to manage saved resources with more control.","title":"Displays and Procedures"},{"location":"cave/bundles-and-procedures/#displays","text":"","title":"Displays"},{"location":"cave/bundles-and-procedures/#file-load-display","text":"Load a previously-saved display from within the AWIPS system. The pop-up dialog allows you to select your own saved displays as well as those saved by other users. When loading a display, all existing tabs will be closed and replaced with the contents from the saved display. Displays will load as many Map Editor tabs as existed when the display was originally saved.","title":"File > Load Display"},{"location":"cave/bundles-and-procedures/#load-display-from-local-disk","text":"To load a previously-saved display from a path within the file directory locally, select File > Load Display and then select the File button on the right to browse your local directories.","title":"Load Display from Local Disk"},{"location":"cave/bundles-and-procedures/#file-save-display","text":"Save a product display within the AWIPS system. This saves the display to the EDEX server for your specific user.","title":"File > Save Display"},{"location":"cave/bundles-and-procedures/#file-save-display-locally","text":"To save a product display to a path within the file directory locally, select File > Save Display Locally and then select the File button on the right.","title":"File > Save Display Locally"},{"location":"cave/bundles-and-procedures/#file-delete-displays","text":"Select and remove a saved display under File > Delete Displays , this will open a pop-up dialog. Select the file name and click OK and then confirm deletion to remove the saved file permanently.","title":"File > Delete Displays"},{"location":"cave/bundles-and-procedures/#procedures","text":"","title":"Procedures"},{"location":"cave/bundles-and-procedures/#new-procedure","text":"Select the menu File > Procedures > New... Select Copy Into to add all loaded resources from your current map to the Procedure Stack Select Save (or Save As ) and then enter a name for the Procedure before clicking OK to save.","title":"New Procedure"},{"location":"cave/bundles-and-procedures/#open-procedure","text":"Similar to creating a new Procedure, select File > Procedures > Open... , select the saved resources and click Load to load them to the current Map Editor tab. If multiple procedure items are wanted for loading, create a new tab for each procedure item and then load that item into the tab. This process is shown in the video below.","title":"Open Procedure"},{"location":"cave/bundles-and-procedures/#delete-procedure","text":"From the menu File > Procedures > Delete... 
you can delete existing Procedure files in a way similar to deleting saved display files.","title":"Delete Procedure"},{"location":"cave/cave-keyboard-shortcuts/","text":"Keyboard Shortcuts \uf0c1 D2D Menu Shortcuts \uf0c1 Action Command Open a New Map Ctrl + N Open a Display Ctrl + O Save Display Ctrl + S Save Display Locally Ctrl + Shift + S Save KML Ctrl + K Exit CAVE Alt + F4 Exit CAVE Ctrl + Q Clear Data Ctrl + C First Frame Ctrl + \u2190 Last Frame Ctrl + \u2192 Step Back \u2190 Step Forward \u2192 Increase Loop Speed Page Up Decrease Loop Speed Page Down Open Time Options Ctrl + T Toggle Image Combination Insert Open Loop Properties Ctrl + L Open Image Properties Ctrl + I D2D All Tilts Shortcuts \uf0c1 Note : Requires all tilts product in main display panel Action Command Step Back 1 Volume \u2190 Step Forward 1 Volume \u2192 Step up 1 Elevation Angle \u2191 Step down 1 Elevation Angle \u2193 Jump to First Frame Ctrl + \u2190 Jump to Last Frame Ctrl + \u2192 Jump to Highest Elevation Angle Ctrl + \u2191 Jump to Lowest Elevation Angle Ctrl + \u2193 D2D Numeric Keypad Shortcuts \uf0c1 Note : Num Lock must be enabled for these keystrokes to work Action Command Increase Brightness of Image 1, Decrease Image 2 [Numpad] + Decrease Brightness of Image 1, Increase Image 2 [Numpad] - Toggle Image Product in Main Map On/Off [Numpad] 0 Toggle First 9 Graphic Products On/Off [Numpad] 1-9 Toggle Next 10 Graphic Products On/Off Shift + [Numpad] 0-9 Toggle Between Images 1 and 2 at Full Brightness [Numpad] . Toggle Legend [Numpad] Enter Panel Combo Rotate (PCR) Shortcuts \uf0c1 Note : These numbers refer to the ones at the top of the Keyboard Action Command Cycle Through PCR Products Delete Return to 4 Panel View End Cycle Back Through PCR Products Backspace Display Corresponding Product 1-8 Text Editor Shortcuts \uf0c1 Action Command Extend Selection to Start of Line Shift + Home Extend Selection to End of Line Shift + End Extend Selection to Start of Document Ctrl + Shift + Home Extend Selection to End of Document Ctrl + Shift + End Extend Selection Up 1 Screen Shift + Page Up Extend Selection Down 1 Screen Shift + Page Down Extend Selection to Previous Character Shift + \u2190 Extend Selection by Previous Word Ctrl + Shift + \u2190 Extend Selection to Next Character Shift + \u2192 Extend Selection by Next Word Ctrl + Shift + \u2192 Extend Selection Up 1 Line Shift + \u2191 Extend Selection Down 1 Line Shift + \u2193 Delete Previous Word Ctrl + Backspace Delete Next Word Ctrl + Delete Close the Window Ctrl + Shift + F4 Undo Ctrl + Z Copy Ctrl + C Paste Ctrl + V Cut Ctrl + X","title":"Keyboard Shortcuts"},{"location":"cave/cave-keyboard-shortcuts/#keyboard-shortcuts","text":"","title":"Keyboard Shortcuts"},{"location":"cave/cave-keyboard-shortcuts/#d2d-menu-shortcuts","text":"Action Command Open a New Map Ctrl + N Open a Display Ctrl + O Save Display Ctrl + S Save Display Locally Ctrl + Shift + S Save KML Ctrl + K Exit CAVE Alt + F4 Exit CAVE Ctrl + Q Clear Data Ctrl + C First Frame Ctrl + \u2190 Last Frame Ctrl + \u2192 Step Back \u2190 Step Forward \u2192 Increase Loop Speed Page Up Decrease Loop Speed Page Down Open Time Options Ctrl + T Toggle Image Combination Insert Open Loop Properties Ctrl + L Open Image Properties Ctrl + I","title":"D2D Menu Shortcuts"},{"location":"cave/cave-keyboard-shortcuts/#d2d-all-tilts-shortcuts","text":"Note : Requires all tilts product in main display panel Action Command Step Back 1 Volume \u2190 Step Forward 1 Volume \u2192 Step up 1 Elevation Angle \u2191
Step down 1 Elevation Angle \u2193 Jump to First Frame Ctrl + \u2190 Jump to Last Frame Ctrl + \u2192 Jump to Highest Elevation Angle Ctrl + \u2191 Jump to Lowest Elevation Angle Ctrl + \u2193","title":"D2D All Tilts Shortcuts"},{"location":"cave/cave-keyboard-shortcuts/#d2d-numeric-keypad-shortcuts","text":"Note : Num Lock must be enabled for these keystrokes to work Action Command Increase Brightness of Image 1, Decrease Image 2 [Numpad] + Decrease Brightness of Image 1, Increase Image 2 [Numpad] - Toggle Image Product in Main Map On/Off [Numpad] 0 Toggle First 9 Graphic Products On/Off [Numpad] 1-9 Toggle Next 10 Graphic Products On/Off Shift + [Numpad] 0-9 Toggle Between Images 1 and 2 at Full Brightness [Numpad] . Toggle Legend [Numpad] Enter","title":"D2D Numeric Keypad Shortcuts"},{"location":"cave/cave-keyboard-shortcuts/#panel-combo-rotate-pcr-shortcuts","text":"Note : These numbers refer to the ones at the top of the Keyboard Action Command Cycle Through PCR Products Delete Return to 4 Panel View End Cycle Back Through PCR Products Backspace Display Corresponding Product 1-8","title":"Panel Combo Rotate (PCR) Shortcuts"},{"location":"cave/cave-keyboard-shortcuts/#text-editor-shortcuts","text":"Action Command Extend Selection to Start of Line Shift + Home Extend Selection to End of Line Shift + End Extend Selection to Start of Document Ctrl + Shift + Home Extend Selection to End of Document Ctrl + Shift + End Extend Selection Up 1 Screen Shift + Page Up Extend Selection Down 1 Screen Shift + Page Down Extend Selection to Previous Character Shift + \u2190 Extend Selection by Previous Word Ctrl + Shift + \u2190 Extend Selection to Next Character Shift + \u2192 Extend Selection by Next Word Ctrl + Shift + \u2192 Extend Selection Up 1 Line Shift + \u2191 Extend Selection Down 1 Line Shift + \u2193 Delete Previous Word Ctrl + Backspace Delete Next Word Ctrl + Delete Close the Window Ctrl + Shift + F4 Undo Ctrl + Z Copy Ctrl + C Paste Ctrl + V Cut Ctrl + X","title":"Text Editor Shortcuts"},{"location":"cave/cave-localization/","text":"Change Localization \uf0c1 Localization Preferences \uf0c1 The default localization site for Unidata AWIPS is OAX (Omaha, Nebraska, where the Raytheon team is located). When you are prompted to connect to an EDEX server, you can change the WFO ID as well.
Since release 16.1.4, CAVE users can switch the localization site to any valid NWS WFO from CAVE > Preferences > Localization , where edits can be made to both the site ID and EDEX server name. Click Restart after changes are applied. This window also has the option to Prompt for settings on startup , which if checked, would ask for the EDEX Server and Site location every time CAVE is started (this can be useful if you are used to switching between servers and/or sites). Change the site (example shows TBW Tampa Bay) and click Apply or OK and confirm the popup dialog, which informs you that you must restart CAVE for the changes to take effect.","title":"Localization Preferences"},{"location":"cave/cave-perspectives/","text":"D2D \uf0c1 D2D (Display 2-Dimensions) is the default AWIPS CAVE perspective, designed to mimmic the look and feel of the legacy AWIPS I system. Frame control, map projection, image properties, and a few featured applications make up the the primary D2D toolbar. CONUS is the default map display of the continental United States in a North Polar Stereographic projection. This menu allows you to select different projections. Clear will remove all non-system resources (meaning data) while preserving any map overlays you have added to the view. is a shortcut to Image Properties for the top-loaded image resource in the stack. freezes and un-freezes panning (movement) of the map. Valid time seq is the default time-matching setting for loading data. Select this menu to switch to configurations such as Latest, No Backfill, Previous run, Prognosis loop, and more. controls the frame number, display, speed, etc. You can also control the frames with the left and right keyboard keys. Application links to Warngen , Ncdata (NCP GEMPAK-like grids), Nsharp , and the Product Browser are also available. Switching Perspectives \uf0c1 D2D is one of many available CAVE perspectives. By selecting the CAVE > Perspective menu you can switch into the GFE , Hydro , Localization , MPE , or National Centers Perspective (which is available in the Other... submenu. Nobody seems to know why the NCP is not listed with the other perspectives, or how to make it appear with them). Resource Stack \uf0c1 At bottom-right of the map window the the Resource Stack, which displays all loaded resources and map overlays, and allows for interaction and customization with the resource via a right-click menu . Left-Click Resource Name to Hide \uf0c1 A left click on any resource in the stack will hide the resource and turn the label gray. Clicking the name again makes the resource visible. Hold-Right-Click Resource Name for Menu \uf0c1 Drag the mouse over a loaded resource and hold the right mouse button until a menu appears (simply clicking the resource with the right mouse button will toggle its visibility). The hold-right-click menu allows you to control individual resource Image Properties , Change Colormaps , change resource color, width, density, and magnification, move resources up and down in the stack, as well as configure custom options with other interactive resources. Hold-Right-Click the Map Background \uf0c1 for additional options, such as greater control over the resource stack legend, toggling a 4-panel display , selecting a Zoom level, and setting a Background Color . Most loaded resources will also have a menu option for reading out the pixel values: Product Browser \uf0c1 The Product Browser allows users to browse a complete data inventory in a side window, organized by data type. 
Selections for GFE , Grids , Lightning , Map Overlays , Radar , Satellite , Redbook , and VIIRS are available. All products loaded with the Product Browser are given default settings.","title":"D2D"},{"location":"cave/cave-perspectives/#d2d","text":"D2D (Display 2-Dimensions) is the default AWIPS CAVE perspective, designed to mimic the look and feel of the legacy AWIPS I system. Frame control, map projection, image properties, and a few featured applications make up the primary D2D toolbar. CONUS is the default map display of the continental United States in a North Polar Stereographic projection. This menu allows you to select different projections. Clear will remove all non-system resources (meaning data) while preserving any map overlays you have added to the view. is a shortcut to Image Properties for the top-loaded image resource in the stack. freezes and un-freezes panning (movement) of the map. Valid time seq is the default time-matching setting for loading data. Select this menu to switch to configurations such as Latest, No Backfill, Previous run, Prognosis loop, and more. controls the frame number, display, speed, etc. You can also control the frames with the left and right keyboard keys. Application links to Warngen , Ncdata (NCP GEMPAK-like grids), Nsharp , and the Product Browser are also available.","title":"D2D"},{"location":"cave/cave-perspectives/#switching-perspectives","text":"D2D is one of many available CAVE perspectives. By selecting the CAVE > Perspective menu you can switch into the GFE , Hydro , Localization , MPE , or National Centers Perspective (which is available in the Other... submenu. Nobody seems to know why the NCP is not listed with the other perspectives, or how to make it appear with them).","title":"Switching Perspectives"},{"location":"cave/cave-perspectives/#resource-stack","text":"At bottom-right of the map window is the Resource Stack, which displays all loaded resources and map overlays, and allows for interaction and customization with the resource via a right-click menu .","title":"Resource Stack"},{"location":"cave/cave-perspectives/#left-click-resource-name-to-hide","text":"A left click on any resource in the stack will hide the resource and turn the label gray. Clicking the name again makes the resource visible.","title":"Left-Click Resource Name to Hide"},{"location":"cave/cave-perspectives/#hold-right-click-resource-name-for-menu","text":"Drag the mouse over a loaded resource and hold the right mouse button until a menu appears (simply clicking the resource with the right mouse button will toggle its visibility). The hold-right-click menu allows you to control individual resource Image Properties , Change Colormaps , change resource color, width, density, and magnification, move resources up and down in the stack, as well as configure custom options with other interactive resources.","title":"Hold-Right-Click Resource Name for Menu"},{"location":"cave/cave-perspectives/#hold-right-click-the-map-background","text":"for additional options, such as greater control over the resource stack legend, toggling a 4-panel display , selecting a Zoom level, and setting a Background Color . Most loaded resources will also have a menu option for reading out the pixel values:","title":"Hold-Right-Click the Map Background"},{"location":"cave/cave-perspectives/#product-browser","text":"The Product Browser allows users to browse a complete data inventory in a side window, organized by data type.
Selections for GFE , Grids , Lightning , Map Overlays , Radar , Satellite , Redbook , and VIIRS are available. All products loaded with the Product Browser are given default settings.","title":"Product Browser"},{"location":"cave/d2d-edit-menus/","text":"Editing Menus \uf0c1 Any of the menus in the menubar can be customized in the Localization Perspective . Modifying Menus \uf0c1 Once in the Localization Perspective , menus can be modified by going to the D2D > Menus directory in the File Browser. Here there are submenus for different data types and menu structures. Usually the index.xml file found in these submenus is the master file which the actual menu is based on. This file can reference other xml files and you may have to modify these child xml files to get the results you are looking for. In order to modify any file, you must right click on it and select Copy To > USER (my-username) . Then you may open this copy and begin to modify it. Once this process has been completed and a change has been made and saved, CAVE will need to be restarted and opened in the D2D perspective to see the change. This example covers how to add a new menu entry to an existing menu. Switch to the Localization Perspective Find the grid folder under D2D > Menus Double-click to expand index.xml Right-click BASE (common_static) and select Copy To... , then select USER level Double-click USER to open the editor and copy an existing include tag, and update the modelName (this must match an existing product found in the Product Browser) and the menuName (this can be anything) Once this is completed, save the file and restart CAVE Navigate to the Models menu and you should see a new entry with GEFS Removing Menus \uf0c1 This example covers how to remove a menu (in this case MRMS ) from D2D: Switch to the Localization Perspective Find the mrms folder under D2D > Menus Double-click to expand index.xml Right-click BASE and select Copy To... , then select USER level Right-click refresh the mrms entry Double click USER to open the editor and change to With this completed, you can now restart CAVE and will not see the MRMS menu anymore. Repeat this example for other product menus, such as radar , upperair , tools , etc., to further customize D2D data menus for any level of localization.","title":"Editing Menus"},{"location":"cave/d2d-edit-menus/#editing-menus","text":"Any of the menus in the menubar can be customized in the Localization Perspective .","title":"Editing Menus"},{"location":"cave/d2d-edit-menus/#modifying-menus","text":"Once in the Localization Perspective , menus can be modified by going to the D2D > Menus directory in the File Browser. Here there are submenus for different data types and menu structures. Usually the index.xml file found in these submenus is the master file which the actual menu is based on. This file can reference other xml files and you may have to modify these child xml files to get the results you are looking for. In order to modify any file, you must right click on it and select Copy To > USER (my-username) . Then you may open this copy and begin to modify it. Once this process has been completed and a change has been made and saved, CAVE will need to be restarted and opened in the D2D perspective to see the change. This example covers how to add a new menu entry to an existing menu. Switch to the Localization Perspective Find the grid folder under D2D > Menus Double-click to expand index.xml Right-click BASE (common_static) and select Copy To... 
, then select USER level Double-click USER to open the editor and copy an existing include tag, and update the modelName (this must match an existing product found in the Product Browser) and the menuName (this can be anything) Once this is completed, save the file and restart CAVE Navigate to the Models menu and you should see a new entry with GEFS","title":"Modifying Menus"},{"location":"cave/d2d-edit-menus/#removing-menus","text":"This example covers how to remove a menu (in this case MRMS ) from D2D: Switch to the Localization Perspective Find the mrms folder under D2D > Menus Double-click to expand index.xml Right-click BASE and select Copy To... , then select USER level Right-click refresh the mrms entry Double click USER to open the editor and change to With this completed, you can now restart CAVE and will not see the MRMS menu anymore. Repeat this example for other product menus, such as radar , upperair , tools , etc., to further customize D2D data menus for any level of localization.","title":"Removing Menus"},{"location":"cave/d2d-gis-shapefiles/","text":"GIS Import \uf0c1 The Geographic Information System (GIS) Import menu entry enables users to import geospatial data from varying GIS data sources for display in CAVE. CAVE currently only supports shape data in WGS84 unprojected latitude/longitude. This section describes how to: Load GIS Data in CAVE Modify the GIS Data Preferences Customize the Attributes Label GIS Data Display GIS Data \uf0c1 Importing a GIS shapefile is accessed through File > Import > GIS Data . The GIS DataStore Parameters dialog is comprised of four sections: DataStore Type : You can select a file type from the dropdown list. The only option is GIS File . Connection Parameters : Click the Browse button and navigate to the directory where your shapefiles are. Pressing Connect will populate the available shapefiles. Load As : Shapefiles can be loaded as a Map or as a Product. Map : The selected shapefile displays as a map, similar to if you load a map from the Maps menu. Product : When this radio button is selected, you will also need to select the start and end date/time the data is valid for. The selected shapefile displays as a product with a shaded (color-filled) image. When plotting with additional products, if the display time falls within the start/end time range selected, the shapefile will display. When the valid time falls outside the start/end time, the map product image does not display. Table : This section lists all of the available shapefiles that are available for display. GIS Data Preferences \uf0c1 Updating GIS display preferences is accessed through CAVE > Preferences > GIS Viewer . You are able to alter the highlight color, style, width, and opacity of the product in the Main Display here. Customizing the GIS Attribute Dialog \uf0c1 You have the ability to highlight or hide specific areas of the displayed map. These functionalities are available by right click and holding on the Map Product ID in the Legend area and selecting Display Attributes . The pop-up window is commonly referred to as the \"Attributes Table\". For each row of information there is an associated map/map product image displayed on the Main Display Pane. Highlighting \uf0c1 Highlighting Selected Areas \uf0c1 To highlight a selected area(s) of the GIS image, highlight the corresponding row(s) in the Attributes Table, right click and hold on one of the selected rows, and check the Highlighted checkbox. 
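A practical note on the WGS84 requirement mentioned in the GIS Import section above: shapefiles stored in another projection need to be converted before CAVE will display them correctly. One way to do that, sketched here with the third-party GeoPandas library (GeoPandas is not part of AWIPS, and the file names below are placeholders), is to reproject to EPSG:4326 and write a new copy.

```python
import geopandas as gpd

# Read a shapefile in its native projection, reproject it to unprojected
# WGS84 latitude/longitude (EPSG:4326), and write a copy for CAVE's
# GIS Import. File names are placeholders.
gdf = gpd.read_file("counties_lambert.shp")
gdf.to_crs(epsg=4326).to_file("counties_wgs84.shp")
```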
Active highlighted rows will be yellow in the table and the corresponding area in the map display will be pink. Unhighlighting Selected Areas \uf0c1 You can unhighlight by selecting the row, right mouse hold and uncheck the Highlighted checkbox. Unhighlighting All Areas \uf0c1 To remove all highlighted, select Annotation > Clear Highlights . If you are interested in a particular area in the Main Display Pane, but don't know the where in the Attributes Table it is, left double-click on the area of interest and the corresponding row will be highlighted. Controlling Visibility of Image Areas \uf0c1 Hiding Selected Areas \uf0c1 To hide a selected area(s) of the GIS image, highlight the corresponding row(s) in the Attributes Table, right click and hold on one of the selected rows, and check the Visible checkbox. Hidden rows will be gray in the table and the corresponding area in the map display will disappear. Unhiding Selected Areas \uf0c1 You can make these images visible by selecting the row, right mouse hold and check the Visible checkbox. Unhiding All Areas \uf0c1 To make all images visible, select Annotation > Make All Visible . Configuring Attributes Table \uf0c1 In the Attributes Table, you have the option to sort by columns and select which columns are displayed. Selecting Columns to Display \uf0c1 By default, all available columns are displayed. The Select Columns dialog will pop-up if you select Data > Select Columns... . You can highlight the columns use the arrows to move them into the Available or Displayed columns. Clicking OK will update your table. Sorting Column Information \uf0c1 The Sort Order dialog will pop-up if you select Data > Sort... . You can use the drop down menu to choose the column to sort by and then sort by Ascending or Descending. You can sort by additional columns. Clicking OK will update your table. Labeling GIS Data \uf0c1 You can select which attribute you want to use to label the objects on the Main Display. To open the Label submenu, right click and hold on the Map Product ID in the Legend area to open a pop-up menu and select Label and choose which attribute you want as the label.","title":"GIS and Shapefiles"},{"location":"cave/d2d-gis-shapefiles/#gis-import","text":"The Geographic Information System (GIS) Import menu entry enables users to import geospatial data from varying GIS data sources for display in CAVE. CAVE currently only supports shape data in WGS84 unprojected latitude/longitude. This section describes how to: Load GIS Data in CAVE Modify the GIS Data Preferences Customize the Attributes Label GIS Data","title":"GIS Import"},{"location":"cave/d2d-gis-shapefiles/#display-gis-data","text":"Importing a GIS shapefile is accessed through File > Import > GIS Data . The GIS DataStore Parameters dialog is comprised of four sections: DataStore Type : You can select a file type from the dropdown list. The only option is GIS File . Connection Parameters : Click the Browse button and navigate to the directory where your shapefiles are. Pressing Connect will populate the available shapefiles. Load As : Shapefiles can be loaded as a Map or as a Product. Map : The selected shapefile displays as a map, similar to if you load a map from the Maps menu. Product : When this radio button is selected, you will also need to select the start and end date/time the data is valid for. The selected shapefile displays as a product with a shaded (color-filled) image. 
When plotting with additional products, if the display time falls within the start/end time range selected, the shapefile will display. When the valid time falls outside the start/end time, the map product image does not display. Table : This section lists all of the available shapefiles that are available for display.","title":"Display GIS Data"},{"location":"cave/d2d-gis-shapefiles/#gis-data-preferences","text":"Updating GIS display preferences is accessed through CAVE > Preferences > GIS Viewer . You are able to alter the highlight color, style, width, and opacity of the product in the Main Display here.","title":"GIS Data Preferences"},{"location":"cave/d2d-gis-shapefiles/#customizing-the-gis-attribute-dialog","text":"You have the ability to highlight or hide specific areas of the displayed map. These functionalities are available by right click and holding on the Map Product ID in the Legend area and selecting Display Attributes . The pop-up window is commonly referred to as the \"Attributes Table\". For each row of information there is an associated map/map product image displayed on the Main Display Pane.","title":"Customizing the GIS Attribute Dialog"},{"location":"cave/d2d-gis-shapefiles/#highlighting","text":"","title":"Highlighting"},{"location":"cave/d2d-gis-shapefiles/#highlighting-selected-areas","text":"To highlight a selected area(s) of the GIS image, highlight the corresponding row(s) in the Attributes Table, right click and hold on one of the selected rows, and check the Highlighted checkbox. Active highlighted rows will be yellow in the table and the corresponding area in the map display will be pink.","title":"Highlighting Selected Areas"},{"location":"cave/d2d-gis-shapefiles/#unhighlighting-selected-areas","text":"You can unhighlight by selecting the row, right mouse hold and uncheck the Highlighted checkbox.","title":"Unhighlighting Selected Areas"},{"location":"cave/d2d-gis-shapefiles/#unhighlighting-all-areas","text":"To remove all highlighted, select Annotation > Clear Highlights . If you are interested in a particular area in the Main Display Pane, but don't know the where in the Attributes Table it is, left double-click on the area of interest and the corresponding row will be highlighted.","title":"Unhighlighting All Areas"},{"location":"cave/d2d-gis-shapefiles/#controlling-visibility-of-image-areas","text":"","title":"Controlling Visibility of Image Areas"},{"location":"cave/d2d-gis-shapefiles/#hiding-selected-areas","text":"To hide a selected area(s) of the GIS image, highlight the corresponding row(s) in the Attributes Table, right click and hold on one of the selected rows, and check the Visible checkbox. Hidden rows will be gray in the table and the corresponding area in the map display will disappear.","title":"Hiding Selected Areas"},{"location":"cave/d2d-gis-shapefiles/#unhiding-selected-areas","text":"You can make these images visible by selecting the row, right mouse hold and check the Visible checkbox.","title":"Unhiding Selected Areas"},{"location":"cave/d2d-gis-shapefiles/#unhiding-all-areas","text":"To make all images visible, select Annotation > Make All Visible .","title":"Unhiding All Areas"},{"location":"cave/d2d-gis-shapefiles/#configuring-attributes-table","text":"In the Attributes Table, you have the option to sort by columns and select which columns are displayed.","title":"Configuring Attributes Table"},{"location":"cave/d2d-gis-shapefiles/#selecting-columns-to-display","text":"By default, all available columns are displayed. 
The Select Columns dialog will pop-up if you select Data > Select Columns... . You can highlight the columns use the arrows to move them into the Available or Displayed columns. Clicking OK will update your table.","title":"Selecting Columns to Display"},{"location":"cave/d2d-gis-shapefiles/#sorting-column-information","text":"The Sort Order dialog will pop-up if you select Data > Sort... . You can use the drop down menu to choose the column to sort by and then sort by Ascending or Descending. You can sort by additional columns. Clicking OK will update your table.","title":"Sorting Column Information"},{"location":"cave/d2d-gis-shapefiles/#labeling-gis-data","text":"You can select which attribute you want to use to label the objects on the Main Display. To open the Label submenu, right click and hold on the Map Product ID in the Legend area to open a pop-up menu and select Label and choose which attribute you want as the label.","title":"Labeling GIS Data"},{"location":"cave/d2d-gridded-models/","text":"Volume Browser \uf0c1 The Volume Browser provides access to numerical models, sounding data, and selected point data sources, such as RAOB, METAR, and Profiler. Through the Browser interface, you can choose the data source(s), field(s), plane(s), and point(s), and generate a customized list of model graphics or images for display. The Volume Browser can be accessed from either the Tools (alphabetically organized) or Models (first option) menus. Visual Overview \uf0c1 The Volume Browser window is divided into four areas: The Menu Bar along the top The Data Selection Menus The Product Selection List The Load Buttons (Diff and Load) to load items from the Product Selection List Each area is then subdivided into menu components. The menu bar along the top of the Volume Browser window has dropdown lists that contain options for controlling all the various menu choices of the Volume Browser. Volume Browser Menu Bar \uf0c1 The dropdown menus in the Volume Browser menu bar contain options for controlling and manipulating the Volume Browser or the products chosen through the Volume Browser File Clone Exit Edit Clear All Clear Sources Clear Fields Clear Panes Select None Select All Find (Ctrl+F) Tools Display Types Loop Types VB Tools \uf0c1 Baselines \uf0c1 Selecting Baselines displays 10 lines, labeled A-A' to J-J', along which cross-sections can be constructed from within the Volume Browser. These baseline resources are editable . If you are zoomed in over an area when you load baselines and none appear, press the middle mouse button (B3) to \"snap\" a baseline to where the mouse cursor is. The system chooses a baseline that has not been recently used. If you are working with a baseline, a second click with B3 will return you to the original baseline, even if you modified another baseline. Points \uf0c1 Points are used to generate model soundings, time-height cross-sections, time series, and variable vs. height plots using the Volume Browser. As with the Baselines, the locations of these Points can be edited in the following manner: \"Snapping\" an Interactive Point : If you are zoomed in over an area when you load Interactive Points and no Points appear, click B3 to \"snap\" a Point to where the mouse cursor is positioned. The system chooses a Point that has not been recently used. If you are currently working with a Point, then a second B3 click will place another Point at the location of your cursor. 
Dynamic Reference Map : When you generate a model sounding, a time-height cross-section, a time series, or a variable vs. height plot, a small reference map indicating the location(s) of the plotted sounding(s) is provided in the upper left corner of the Main Display Pane. Points may be created, deleted, hidden, and manipulated (location, name, font, and color). Points are not limited in terms of number, location, or designation. Points may also be assigned to different groups to facilitate their use. Choose By ID \uf0c1 Choose By ID, which is a function of DMD (Digital Mesocyclone Display), is a method of selecting feature locations. The tool is used to monitor the same feature at a certain location. Without the Choose By ID tool, a monitored feature (over a period of time) could move away from its monitored location and another feature could move in its place. You can use Choose By ID to set points, baselines, and \"Home\" for conventional locations like METARs and RAOBs (Radiosonde Observations), but its primary use is for the WSR-88D-identified mesocyclone locations. Display Types \uf0c1 Plan View (default) \uf0c1 This is the default option for the Volume Browser. From the Plan-view perspective, data are plotted onto horizontal surfaces. The additional options menu that appears in the Volume Browser menu bar allows you to choose whether you want the Plan view data to Animate in Time or Animate in Space. Cross Section \uf0c1 Allows you to view gridded data as vertical slices along specific baselines. You need to use either the Interactive Baseline Tool or the predefined latitude/longitude baselines to specify the slice you wish to see. One of the additional options menus that appear in the Volume Browser menu bar allows you to choose whether you want the cross-section data to animate in time or space, while the other options menu allows you to adjust the vertical resolution. Descriptions of these options follows. (Note that the Fields and Planes submenu labels have changed after selecting \"Cross section.\") Time Height \uf0c1 Used in conjunction with the Interactive Points Tool to enable you to view a time height cross section of a full run of gridded model data for a specific location. Additional options menus in the Volume Browser menu bar allow you to choose the direction in which you want the data to be plotted, and to adjust the vertical resolution. Var vs Hgt \uf0c1 Enables you to view a profile of a meteorological model field as it changes through height, which is displayed in millibars. By using the Interactive Points Tool, you can select one or more locations from which to plot the data. Sounding \uf0c1 Works in conjunction with the Interactive Points Tool to enable you to generate a Skew-T chart for a specific location, no additional menus appear in the Volume Browser when the Soundings setting is chosen. Time Series \uf0c1 Used in conjunction with the Interactive Points Tool to enable you to plot gridded data on a time versus data value graph for a specified point. Loop Types \uf0c1 Time \uf0c1 The default option for the Volume Browser. It allows you to view model data through time Space \uf0c1 Allows you to loop through a series of predefined latitude or longitude cross-sectional slices at a fixed time.","title":"Volume Browser"},{"location":"cave/d2d-gridded-models/#volume-browser","text":"The Volume Browser provides access to numerical models, sounding data, and selected point data sources, such as RAOB, METAR, and Profiler. 
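The model fields, planes, and points that the Volume Browser works with can also be requested from a script with python-awips. The sketch below is only an illustration; the GFS model name, the T parameter, and the 500MB level are example selections that may be spelled differently on a given EDEX server.

```python
from awips.dataaccess import DataAccessLayer

DataAccessLayer.changeEDEXHost("edex-cloud.unidata.ucar.edu")

# Build a grid request roughly the way a source, field, and plane are
# picked in the Volume Browser (all three values are examples).
request = DataAccessLayer.newDataRequest("grid")
request.setLocationNames("GFS")   # data source / model name
request.setParameters("T")        # field: temperature
request.setLevels("500MB")        # plane: 500 hPa

# List the available forecast times and fetch the most recent grid.
times = DataAccessLayer.getAvailableTimes(request)
grids = DataAccessLayer.getGridData(request, [times[-1]])
lons, lats = grids[0].getLatLonCoords()
print(grids[0].getDataTime(), grids[0].getRawData().shape)
```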
Through the Browser interface, you can choose the data source(s), field(s), plane(s), and point(s), and generate a customized list of model graphics or images for display. The Volume Browser can be accessed from either the Tools (alphabetically organized) or Models (first option) menus.","title":"Volume Browser"},{"location":"cave/d2d-gridded-models/#visual-overview","text":"The Volume Browser window is divided into four areas: The Menu Bar along the top The Data Selection Menus The Product Selection List The Load Buttons (Diff and Load) to load items from the Product Selection List Each area is then subdivided into menu components. The menu bar along the top of the Volume Browser window has dropdown lists that contain options for controlling all the various menu choices of the Volume Browser.","title":"Visual Overview"},{"location":"cave/d2d-gridded-models/#volume-browser-menu-bar","text":"The dropdown menus in the Volume Browser menu bar contain options for controlling and manipulating the Volume Browser or the products chosen through the Volume Browser File Clone Exit Edit Clear All Clear Sources Clear Fields Clear Panes Select None Select All Find (Ctrl+F) Tools Display Types Loop Types","title":"Volume Browser Menu Bar"},{"location":"cave/d2d-gridded-models/#vb-tools","text":"","title":"VB Tools"},{"location":"cave/d2d-gridded-models/#baselines","text":"Selecting Baselines displays 10 lines, labeled A-A' to J-J', along which cross-sections can be constructed from within the Volume Browser. These baseline resources are editable . If you are zoomed in over an area when you load baselines and none appear, press the middle mouse button (B3) to \"snap\" a baseline to where the mouse cursor is. The system chooses a baseline that has not been recently used. If you are working with a baseline, a second click with B3 will return you to the original baseline, even if you modified another baseline.","title":"Baselines"},{"location":"cave/d2d-gridded-models/#points","text":"Points are used to generate model soundings, time-height cross-sections, time series, and variable vs. height plots using the Volume Browser. As with the Baselines, the locations of these Points can be edited in the following manner: \"Snapping\" an Interactive Point : If you are zoomed in over an area when you load Interactive Points and no Points appear, click B3 to \"snap\" a Point to where the mouse cursor is positioned. The system chooses a Point that has not been recently used. If you are currently working with a Point, then a second B3 click will place another Point at the location of your cursor. Dynamic Reference Map : When you generate a model sounding, a time-height cross-section, a time series, or a variable vs. height plot, a small reference map indicating the location(s) of the plotted sounding(s) is provided in the upper left corner of the Main Display Pane. Points may be created, deleted, hidden, and manipulated (location, name, font, and color). Points are not limited in terms of number, location, or designation. Points may also be assigned to different groups to facilitate their use.","title":"Points"},{"location":"cave/d2d-gridded-models/#choose-by-id","text":"Choose By ID, which is a function of DMD (Digital Mesocyclone Display), is a method of selecting feature locations. The tool is used to monitor the same feature at a certain location. Without the Choose By ID tool, a monitored feature (over a period of time) could move away from its monitored location and another feature could move in its place. 
You can use Choose By ID to set points, baselines, and \"Home\" for conventional locations like METARs and RAOBs (Radiosonde Observations), but its primary use is for the WSR-88D-identified mesocyclone locations.","title":"Choose By ID"},{"location":"cave/d2d-gridded-models/#display-types","text":"","title":"Display Types"},{"location":"cave/d2d-gridded-models/#plan-view-default","text":"This is the default option for the Volume Browser. From the Plan-view perspective, data are plotted onto horizontal surfaces. The additional options menu that appears in the Volume Browser menu bar allows you to choose whether you want the Plan view data to Animate in Time or Animate in Space.","title":"Plan View (default)"},{"location":"cave/d2d-gridded-models/#cross-section","text":"Allows you to view gridded data as vertical slices along specific baselines. You need to use either the Interactive Baseline Tool or the predefined latitude/longitude baselines to specify the slice you wish to see. One of the additional options menus that appear in the Volume Browser menu bar allows you to choose whether you want the cross-section data to animate in time or space, while the other options menu allows you to adjust the vertical resolution. Descriptions of these options follows. (Note that the Fields and Planes submenu labels have changed after selecting \"Cross section.\")","title":"Cross Section"},{"location":"cave/d2d-gridded-models/#time-height","text":"Used in conjunction with the Interactive Points Tool to enable you to view a time height cross section of a full run of gridded model data for a specific location. Additional options menus in the Volume Browser menu bar allow you to choose the direction in which you want the data to be plotted, and to adjust the vertical resolution.","title":"Time Height"},{"location":"cave/d2d-gridded-models/#var-vs-hgt","text":"Enables you to view a profile of a meteorological model field as it changes through height, which is displayed in millibars. By using the Interactive Points Tool, you can select one or more locations from which to plot the data.","title":"Var vs Hgt"},{"location":"cave/d2d-gridded-models/#sounding","text":"Works in conjunction with the Interactive Points Tool to enable you to generate a Skew-T chart for a specific location, no additional menus appear in the Volume Browser when the Soundings setting is chosen.","title":"Sounding"},{"location":"cave/d2d-gridded-models/#time-series","text":"Used in conjunction with the Interactive Points Tool to enable you to plot gridded data on a time versus data value graph for a specified point.","title":"Time Series"},{"location":"cave/d2d-gridded-models/#loop-types","text":"","title":"Loop Types"},{"location":"cave/d2d-gridded-models/#time","text":"The default option for the Volume Browser. 
It allows you to view model data through time.","title":"Time"},{"location":"cave/d2d-gridded-models/#space","text":"Allows you to loop through a series of predefined latitude or longitude cross-sectional slices at a fixed time.","title":"Space"},{"location":"cave/d2d-grids/","text":"MSLP and Precipitation \uf0c1 Sfc Temperature and Wind \uf0c1 Sfc Dewpoint Temperature \uf0c1 Sfc Relative Humidity \uf0c1 30mb Mean Dewpoint \uf0c1 Precipitable Water \uf0c1 Simulated Reflectivity (REFC) \uf0c1 Lightning Threat \uf0c1 Precip Type / Moisture Transport \uf0c1 Vorticity (500mb) \uf0c1 Vertical Velocity (500mb, 700mb, 850mb) \uf0c1 Thickness / Vorticity Advection (Trenberth) \uf0c1 Wind / Height (850mb, 700mb, 500mb, 300mb, 250mb) \uf0c1 Potential Vorticity (250mb) \uf0c1 Helicity / Storm-Relative Flow \uf0c1 Hail Parameters \uf0c1 MCS Parameters \uf0c1 Isentropic Analysis (270K-320K) \uf0c1","title":"D2d grids"},{"location":"cave/d2d-grids/#mslp-and-precipitation","text":"","title":"MSLP and Precipitation"},{"location":"cave/d2d-grids/#sfc-temperature-and-wind","text":"","title":"Sfc Temperature and Wind"},{"location":"cave/d2d-grids/#sfc-dewpoint-temperature","text":"","title":"Sfc Dewpoint Temperature"},{"location":"cave/d2d-grids/#sfc-relative-humidity","text":"","title":"Sfc Relative Humidity"},{"location":"cave/d2d-grids/#30mb-mean-dewpoint","text":"","title":"30mb Mean Dewpoint"},{"location":"cave/d2d-grids/#precipitable-water","text":"","title":"Precipitable Water"},{"location":"cave/d2d-grids/#simulated-reflectivity-refc","text":"","title":"Simulated Reflectivity (REFC)"},{"location":"cave/d2d-grids/#lightning-threat","text":"","title":"Lightning Threat"},{"location":"cave/d2d-grids/#precip-type-moisture-transport","text":"","title":"Precip Type / Moisture Transport"},{"location":"cave/d2d-grids/#vorticity-500mb","text":"","title":"Vorticity (500mb)"},{"location":"cave/d2d-grids/#vertical-velocity-500mb-700mb-850mb","text":"","title":"Vertical Velocity (500mb, 700mb, 850mb)"},{"location":"cave/d2d-grids/#thickness-vorticity-advection-trenberth","text":"","title":"Thickness / Vorticity Advection (Trenberth)"},{"location":"cave/d2d-grids/#wind-height-850mb-700mb-500mb-300mb-250mb","text":"","title":"Wind / Height (850mb, 700mb, 500mb, 300mb, 250mb)"},{"location":"cave/d2d-grids/#potential-vorticity-250mb","text":"","title":"Potential Vorticity (250mb)"},{"location":"cave/d2d-grids/#helicity-storm-relative-flow","text":"","title":"Helicity / Storm-Relative Flow"},{"location":"cave/d2d-grids/#hail-parameters","text":"","title":"Hail Parameters"},{"location":"cave/d2d-grids/#mcs-parameters","text":"","title":"MCS Parameters"},{"location":"cave/d2d-grids/#isentopic-analysis-270k-320k","text":"","title":"Isentropic Analysis (270K-320K)"},{"location":"cave/d2d-hydro/","text":"The NCEP/Hydro menu contains nine sections: SPC, TPC, NCO, HPC, MPC, CPC, AWC, Hydro, and Local Analyses/Statistical Guidance. Each section is further subdivided into related products, as described below. For more information on hydro products, refer to documentation prepared by the NWS' Office of Hydrology. SPC \uf0c1 Storm Prediction Center (SPC) Watches, Severe Weather Plots, SPC Convective Outlooks, and Fire Weather information. Severe Weather Plots are extracted from the STADTS and STAHRY text products and plotted to time-match the current display. The Severe Weather Plots data set in the NCEP/Hydro Menu can be interrogated (sampled) for more detailed information by clicking mouse Button 1 (B1) over a site. 
TPC \uf0c1 Contains the hurricane submenu, which comprises graphic products that display the Marine/Tropical Cyclone Advisory (TCM), the Public Tropical Cyclone Advisory (TCP), hourly forecasts, and model guidance. HPC \uf0c1 Contains 6-hour QPF (Quantitative Precipitation Forecast) data plus the submenus, described below, for Precipitation and Temps & Weather products. Precipitation Contains probabilities of daily precipitation, precipitation accumulation, and probabilities of daily snowfall. In addition, this submenu enables you to display QPF projections for 1 to 3 days in 6 hour increments, 4 to 5 days in 48 hour increments, and 1 to 5 days in 120 hour increments. The HPC Excessive Rainfall product consists of a contour graphic and image of the excessive rainfall for day 1 (with forecast times of 21, 24, 27, or 30 hours), and days 2 and 3 (both with forecast times of 48 and 72 hours). The HPC product will update the selected forecast cycle twice per day. Temps & Weather Contains daily Max/Min temperature anomalies, daily heat index probabilities, and pressure and frontal analysis. MPC \uf0c1 Contains the Marine Guidance submenu, which includes marine analyses and model guidance. Note that the Marine Prediction Center (MPC) is now called the Ocean Prediction Center (OPC). CPC \uf0c1 Contains threat charts and outlook grids derived from these two submenus: Threat Charts Contains drought monitoring data, daily threats assessment, and daily heat index forecasts. Outlook Grids Contains temperature and precipitation probabilities. AWC \uf0c1 Contains CCFP (Collaborative Convective Forecast Product), an aviation product. Formerly located under the Aviation option on the Upper Air menu, CCFP is a strategic forecast of convection to guide traffic managers in their system-wide approach to managing traffic. The forecast suite consists of 3 forecast maps with selectable lead times (4, 6, and 8 hours). The forecasts are issued by the Aviation Weather Center (AWC) between March 1 and October 30, eleven times per day. CCFP is alpha-numeric information suitable for the graphical depiction of forecast areas of significant thunderstorms. The CCFP message covers the CONUS area, and includes information on the location of thunderstorm areas, and associated information such as storm tops, coverage, confidence, and direction/speed of movement. NCO \uf0c1 Contains Precip & Stability, Temps & Weather, National Centers model, NGM MOS (NGM-based MOS system), and the following Sounding-derived plots submenus. Precip & Stability : Contains precipitation, radar, and stability products. Temps & Weather : Contains Max/Min temperature, freezing level, weather depiction, and surface geostrophic wind and relative vorticity plots. National Centers Models : Contains model guidance from the National Centers Sounding-derived plots : Contains options to display model soundings (sometimes called \"BUFR soundings\" because they are packaged in BUFR format for transmission). These are soundings extracted directly from the model, including all levels not generated from the pressure-level grids used elsewhere in the system. Sounding Availability This option displays the sounding locations (shown with asterisks) available from the latest model run; typically these locations coincide with TAF (Terminal Aerodrome Forecast) locations. The plot will update with each model run. Because the sounding data is quite voluminous, only soundings over your State(s) scale are saved. 
Surface The Surface Plots, which mimic the METAR Surface Plots, are taken from the model-derived soundings and provide hourly forecast surface plots. Because you cannot see all forecast projections in a 32 frame loop (e.g., displaying the entire North American Model (NAM) or Global Forecasting System (GFS) run would require 61 frames), you will probably want to use the Time Options Tool (refer to Subsection 2.2.6.4) to view a subset of the forecast -- perhaps a continuous run of hours or every other hour for the whole run. Ceiling/Visibility The \"Ceil/Vis Plot\" shows weather (rain, frz rain, snow) on the right, a stack of three cloud layers above, and visibility below the METAR station. The cloud layers are defined as low (990mb-640mb), mid (640mb-350mb), and high (<350mb). Each cloud layer shows a coverage circle with clear, sct, bkn, and ovc options. Next to one of the circles, there may be a cloud base. The cloud base is sent as a pressure, but is plotted in hft MSL based on a Standard Atmosphere conversion. Because the cloud layers and the cloud base are generated from separate algorithms at NCEP (National Centers for Environmental Prediction), it is possible to have broken or overcast clouds indicated but no base; alternatively, the base may be shown with a high overcast, while ignoring a mid broken layer. Also, a cloud base is reported if convective precipitation is indicated, even for only 10-20% cloud cover. As a result, one can see a cloud base associated with scattered clouds. 1 Hr and 3 Hr Precip Amt This option shows hourly amounts for NAM and 3 hour intervals for GFS at each location. Cloud Layers This option displays the amount of low, middle, and high cloud cover, each as a standard sky coverage symbol, and weather type as a weather symbol. Hydro \uf0c1 Contains QPE, QPF, and RFC Flash Flood Guidance submenus. Hydro Applications, such as HydroView and MPE Editor, are loaded from the Perspectives dialog (Hydro and MPE, respectively) or from the HydroApps menu in the Hydro(View) Perspective (Hydrobase, RiverPro, XDAT, Forecast Service, River Monitor, Precip Monitor, SSHP, and Dam Catalog). QPE : Makes available mosaic images of RFC-generated Quantitative Precipitation Estimator (QPE) and the Multisensor Precipitation Estimator (MPE) grids, which are displayed using a 'truncated' grid color table that shows zero values in gray to let you see the limits of the site-specified domain. These mosaic images are generated by the RFCs in 1, 6, and 24 hour cycles. The MPE grids can be displayed as local contours or images. NESDIS produces two types of Satellite Precipitation Estimates (SPE) based on GOES (Geostationary Operational Environmental Satellite) imagery series: Auto SPEs and Manual SPEs. Auto SPEs, which can be displayed directly from the QPE submenu, are produced hourly based on the most recent one-hour series of IR GOES imagery. This product is displayable on any AWIPS scale. The Auto SPE estimates are displayed in units of inches of precipitation that fell during the specified one hour period. Manual SPEs are accessible through the Manual SPE submenu. You can access the Manual SPE submenu from the QPE submenu. Generation of these products requires substantial manual intervention by NESDIS personnel; consequently, these products are generated and distributed to AWIPS at variable frequencies, as significant precipitation events warrant (i.e., their frequency is variable). The duration (or valid period) of the Manual SPEs is also variable. 
Whereas the duration of Auto SPEs is always one hour, the duration of the Manual SPEs ranges from 1 to 12 hours. Furthermore, although each Manual SPE product is mapped to a CONUS grid, the area of analysis is usually regional (focusing on the significant precipitation event). Apart from these important differences, the Manual SPEs are very similar to the Auto SPEs. QPF : Displays QPF, which indicate how much precipitation will occur in a particular grid. QPFs, which are issued by the RFCs, display as contours by default. However, from the pop-up menu you can convert them to image form. RFC Flash Flood Guidance : Displays County and Zone Flash Flood Guidance (FFG) grids on any scale. The area for which the data is displayed is limited, but the site system manager may configure a larger area. In addition, 1h, 3h, and 6h mosaic RFC-generated FFG grids can be displayed for both local and other RFC locations. Local Analyses/Statistical Guidance \uf0c1 Model Output Statistical (MOS) plots derived from the MOS BUFR and Text Bulletins display forecast data for GFS MOS, GFS-Extended MOS, Eta MOS, and NGM MOS. The plots are accessed by selecting NGM or GFS-LAMP/MOS forecasts under the Local Analyses/Statistical Guidance option.","title":"D2d hydro"},{"location":"cave/d2d-hydro/#spc","text":"Storm Prediction Center (SPC) Watches, Severe Weather Plots, SPC Convective Outlooks, and Fire Weather information. Severe Weather Plots are extracted from the STADTS and STAHRY text products and plotted to time-match the current display. The Severe Weather Plots data set in the NCEP/Hydro Menu can be interrogated (sampled) for more detailed information by clicking mouse Button 1 (B1) over a site.","title":"SPC"},{"location":"cave/d2d-hydro/#tpc","text":"Contains the hurricane submenu, which comprises graphic products that display the Marine/Tropical Cyclone Advisory (TCM), the Public Tropical Cyclone Advisory (TCP), hourly forecasts, and model guidance.","title":"TPC"},{"location":"cave/d2d-hydro/#hpc","text":"Contains 6-hour QPF (Quantitative Precipitation Forecast) data plus the submenus, described below, for Precipitation and Temps & Weather products. Precipitation Contains probabilities of daily precipitation, precipitation accumulation, and probabilities of daily snowfall. In addition, this submenu enables you to display QPF projections for 1 to 3 days in 6 hour increments, 4 to 5 days in 48 hour increments, and 1 to 5 days in 120 hour increments. The HPC Excessive Rainfall product consists of a contour graphic and image of the excessive rainfall for day 1 (with forecast times of 21, 24, 27, or 30 hours), and days 2 and 3 (both with forecast times of 48 and 72 hours). The HPC product will update the selected forecast cycle twice per day. Temps & Weather Contains daily Max/Min temperature anomalies, daily heat index probabilities, and pressure and frontal analysis.","title":"HPC"},{"location":"cave/d2d-hydro/#mpc","text":"Contains the Marine Guidance submenu, which includes marine analyses and model guidance. Note that the Marine Prediction Center (MPC) is now called the Ocean Prediction Center (OPC).","title":"MPC"},{"location":"cave/d2d-hydro/#cpc","text":"Contains threat charts and outlook grids derived from these two submenus: Threat Charts Contains drought monitoring data, daily threats assessment, and daily heat index forecasts. 
Outlook Grids Contains temperature and precipitation probabilities.","title":"CPC"},{"location":"cave/d2d-hydro/#awc","text":"Contains CCFP (Collaborative Convective Forecast Product), an aviation product. Formerly located under the Aviation option on the Upper Air menu, CCFP is a strategic forecast of convection to guide traffic managers in their system-wide approach to managing traffic. The forecast suite consists of 3 forecast maps with selectable lead times (4, 6, and 8 hours). The forecasts are issued by the Aviation Weather Center (AWC) between March 1 and October 30, eleven times per day. CCFP is alpha-numeric information suitable for the graphical depiction of forecast areas of significant thunderstorms. The CCFP message covers the CONUS area, and includes information on the location of thunderstorm areas, and associated information such as storm tops, coverage, confidence, and direction/speed of movement.","title":"AWC"},{"location":"cave/d2d-hydro/#nco","text":"Contains Precip & Stability, Temps & Weather, National Centers model, NGM MOS (NGM-based MOS system), and the following Sounding-derived plots submenus. Precip & Stability : Contains precipitation, radar, and stability products. Temps & Weather : Contains Max/Min temperature, freezing level, weather depiction, and surface geostrophic wind and relative vorticity plots. National Centers Models : Contains model guidance from the National Centers Sounding-derived plots : Contains options to display model soundings (sometimes called \"BUFR soundings\" because they are packaged in BUFR format for transmission). These are soundings extracted directly from the model, including all levels not generated from the pressure-level grids used elsewhere in the system. Sounding Availability This option displays the sounding locations (shown with asterisks) available from the latest model run; typically these locations coincide with TAF (Terminal Aerodrome Forecast) locations. The plot will update with each model run. Because the sounding data is quite voluminous, only soundings over your State(s) scale are saved. Surface The Surface Plots, which mimic the METAR Surface Plots, are taken from the model-derived soundings and provide hourly forecast surface plots. Because you cannot see all forecast projections in a 32 frame loop (e.g., displaying the entire North American Model (NAM) or Global Forecasting System (GFS) run would require 61 frames), you will probably want to use the Time Options Tool (refer to Subsection 2.2.6.4) to view a subset of the forecast -- perhaps a continuous run of hours or every other hour for the whole run. Ceiling/Visibility The \"Ceil/Vis Plot\" shows weather (rain, frz rain, snow) on the right, a stack of three cloud layers above, and visibility below the METAR station. The cloud layers are defined as low (990mb-640mb), mid (640mb-350mb), and high (<350mb). Each cloud layer shows a coverage circle with clear, sct, bkn, and ovc options. Next to one of the circles, there may be a cloud base. The cloud base is sent as a pressure, but is plotted in hft MSL based on a Standard Atmosphere conversion. Because the cloud layers and the cloud base are generated from separate algorithms at NCEP (National Centers for Environmental Prediction), it is possible to have broken or overcast clouds indicated but no base; alternatively, the base may be shown with a high overcast, while ignoring a mid broken layer. Also, a cloud base is reported if convective precipitation is indicated, even for only 10-20% cloud cover. 
As a result, one can see a cloud base associated with scattered clouds. 1 Hr and 3 Hr Precip Amt This option shows hourly amounts for NAM and 3 hour intervals for GFS at each location. Cloud Layers This option displays the amount of low, middle, and high cloud cover, each as a standard sky coverage symbol, and weather type as a weather symbol.","title":"NCO"},{"location":"cave/d2d-hydro/#hydro","text":"Contains QPE, QPF, and RFC Flash Flood Guidance submenus. Hydro Applications, such as HydroView and MPE Editor, are loaded from the Perspectives dialog (Hydro and MPE, respectively) or from the HydroApps menu in the Hydro(View) Perspective (Hydrobase, RiverPro, XDAT, Forecast Service, River Monitor, Precip Monitor, SSHP, and Dam Catalog). QPE : Makes available mosaic images of RFC-generated Quantitative Precipitation Estimator (QPE) and the Multisensor Precipitation Estimator (MPE) grids, which are displayed using a 'truncated' grid color table that shows zero values in gray to let you see the limits of the site-specified domain. These mosaic images are generated by the RFCs in 1, 6, and 24 hour cycles. The MPE grids can be displayed as local contours or images. NESDIS produces two types of Satellite Precipitation Estimates (SPE) based on GOES (Geostationary Operational Environmental Satellite) imagery series: Auto SPEs and Manual SPEs. Auto SPEs, which can be displayed directly from the QPE submenu, are produced hourly based on the most recent one-hour series of IR GOES imagery. This product is displayable on any AWIPS scale. The Auto SPE estimates are displayed in units of inches of precipitation that fell during the specified one hour period. Manual SPEs are accessible through the Manual SPE submenu. You can access the Manual SPE submenu from the QPE submenu. Generation of these products requires substantial manual intervention by NESDIS personnel; consequently, these products are generated and distributed to AWIPS at variable frequencies, as significant precipitation events warrant (i.e., their frequency is variable). The duration (or valid period) of the Manual SPEs is also variable. Whereas the duration of Auto SPEs is always one hour, the duration of the Manual SPEs ranges from 1 to 12 hours. Furthermore, although each Manual SPE product is mapped to a CONUS grid, the area of analysis is usually regional (focusing on the significant precipitation event). Apart from these important differences, the Manual SPEs are very similar to the Auto SPEs. QPF : Displays QPF, which indicate how much precipitation will occur in a particular grid. QPFs, which are issued by the RFCs, display as contours by default. However, from the pop-up menu you can convert them to image form. RFC Flash Flood Guidance : Displays County and Zone Flash Flood Guidance (FFG) grids on any scale. The area for which the data is displayed is limited, but the site system manager may configure a larger area. In addition, 1h, 3h, and 6h mosaic RFC-generated FFG grids can be displayed for both local and other RFC locations.","title":"Hydro"},{"location":"cave/d2d-hydro/#local-analysesstatistical-guidance","text":"Model Output Statistical (MOS) plots derived from the MOS BUFR and Text Bulletins display forecast data for GFS MOS, GFS-Extended MOS, Eta MOS, and NGM MOS. 
The plots are accessed by selecting NGM or GFS-LAMP/MOS forecasts under the Local Analyses/Statistical Guidance option.","title":"Local Analyses/Statistical Guidance"},{"location":"cave/d2d-map-resources/","text":"These maps are accessible through the Maps dropdown menu. Interstates Interstates and US Highways Warning Areas (with station identifier) WSR-88D Station Locations","title":"D2d map resources"},{"location":"cave/d2d-perspective/","text":"D2D Perspective \uf0c1 D2D (Display 2-Dimensions) is the default AWIPS CAVE perspective, designed to mimic the look and feel of the legacy AWIPS I system. System menus include CAVE , File , View , Options , and Tools . Data menus include Models , Surface , NCEP/Hydro , Upper Air , Satellite , Local Radar Stations , Radar , MRMS , and Maps . Map projection, image properties, frame control, and a few featured applications ( Warngen , Nsharp , and Browser ) make up the primary D2D toolbar. Note : Depending on which Operating System version of CAVE, there may be other application options ( PGEN , GEMPAK ). Resource Stack \uf0c1 At the bottom-right of the map window is the Resource Stack, which displays all loaded resources and map overlays, and allows for interaction and customization with the resource via a right-click menu . There are three available views of the Resource Stack; the default will show all Product Resources. The other two views are the Simple view, which shows the time, and the Map Resources. To switch between views see the Right-Click Functionality . It's important to understand that Product Resources and Map Resources are handled differently given the time-based nature of Products, compared to the static nature of maps. Selecting the Clear button will remove all Products but not remove any Map Products. Left-Click Resource Name to Hide \uf0c1 A left click on any resource in the stack will hide the resource and turn the label gray. Clicking the name again makes the resource visible. Right-Click Background to Cycle Resource Views \uf0c1 The default display in the resource stack is the Product Resources. Right Click the mouse on the map background (anywhere but on the stack itself) to switch to a Simple View, which just shows the current displayed time if product data is loaded. Right Click again to show all Map Resources. Right Click again to switch back to Product Resources. Hold-Right-Click Resource Name for Menu \uf0c1 Drag the mouse over a loaded resource and hold the right mouse button until a menu appears. The hold-right-click menu allows you to control individual resource Image Properties , Change Colormaps , change resource color, width, density, and magnification, move resources up and down in the stack, as well as configure custom options with other interactive resources. This menu also gives you the option to unload this specific product , as opposed to removing all data products. Simply select the Unload option at the bottom of the resource's hold-right-click menu. Display Menu \uf0c1 The display menu has many options which can alter the functionality in CAVE. Hold-Right-Click Background for Display Menu \uf0c1 Holding down the right mouse button anywhere in the map view will open a right-click menu. Show Map Legends \uf0c1 From the above menu select Show Map Legends and watch the Resource Stack show only map resources which are loaded to the view. Sample Loaded Resources \uf0c1 Most data types have a right-click menu option for reading out the pixel value, displayed as multi-line text for multiple resources. 
This can be toggled on and off by selecting the Sample option in the Display Menu. Toggle 2 or 4-Panel Layout \uf0c1 Right-click hold in the view and select Two Panel Layout or Four Panel Layout to create duplicates of the current view. Notice the readout is at the same position in both panels. Any mouse movement made on one panel will be made on the other. By default, loading any data will load that data onto both panels. However, there is the option to specify which panel you would like to load data into, which can be useful if you want to have different data in each of the panels. To access this option, simply hold-right-click to pull up the Display menu and choose Load to This Panel as shown below: Now, a yellow L will appear in the lower left hand corner of the panel you selected to load data to. When data is loaded from the menus it will only load to the display designated with the L. Switch back to loading in both panels by using the Load to All Panels option in the Display Menu. From this multi-pane display, hold-right-click again and you will see the Single Panel Layout option to switch back to a standard view (defaulting to the left of two, and top-left of four). Unload Data \uf0c1 Select Unload All Products to remove all loaded graphic and image products from the display and start fresh. Select Unload Graphics to remove all but the image products. Product Browser \uf0c1 The Product Browser allows users to browse a complete data inventory in a side window, organized by data type. To open the Product Browser, either select the icon in the toolbar ( ), or go to the menu: CAVE > Data Browsers > Product Browser . Selections for Grid , Lightning , Maps , Radar , Redbook , and Satellite are available. All products loaded with the Product Browser are given default settings. Note : The Linux and Mac versions also have a selection for GFE available. Options Menu \uf0c1 There are several toggle options and options dialogs that are available under the Options menu found at the top of the application. Time Options (Ctrl + T) \uf0c1 This check button enables/disables the ability to select the time interval between frames of real-time or model data. This feature has the added benefit of allowing you to view extended amounts of data (temporally) but stay within the limits of 64 frames. For example, METAR surface plots, which typically display every hour, can be set to display every three hours via the Select Valid Time and Time Resolution Dialog Box. When the Time Options check button is selected, the next product you choose to display in the Main Display Pane launches either the Select Valid Time and Time Resolution dialog box or the Select Offset and Tolerance dialog box. When you are loading data to an empty display and the Time Options check button is enabled, the Select Valid Time and Time Resolution dialog box opens. Valid Time: In this column of dates/times, you may choose the one that will be the first frame loaded onto the Large Display Pane. The Default option is the most recent data. Time Resolution: This column contains various time increments in which the data can be displayed. Once you make a selection, the Valid Time Column indents the exact times that will be displayed. The Default resolution displays the most recent frames available. 
With the Time Options check button enabled for a display that already contains data, when you choose the data to be overlaid in the Main Display Pane, the Select Offset and Tolerance dialog box appears, providing the following options: Offset : This column contains various time increments at intervals before, at, or after the time you selected for the first product that is displayed in the Main Display Pane. Tolerance : The options in this column refer to how strict the time matching is. \"None\" means an exact match, while \"Infinite\" will put the closest match in each frame, regardless of how far off it is. Image Combination (Insert) \uf0c1 This check button enables/disables the ability to display two images at once. Combined-image displays have been improved by removing the valid time for non-forecast products and removing the date string (time is kept) from the left side of the legend. In particular, this makes All-Tilts radar legends more usable. Display Properties \uf0c1 This menu option opens the Display Properties dialog box. Most of the options available in this dialog box are also available on the Toolbar , while the rest are available in the individual resource menus if that resource uses these properties. Loop Properties (Ctrl + L) \uf0c1 Loop Properties is another dialog box that can be opened from the Options menu or from the Loop Properties iconified button on the D2D Toolbar, or by using the Ctrl + L keyboard shortcut. The dialog allows you to adjust the forward and backward speeds, with 0 = off and 10 = maximum speed. You can set the duration of the first and last frame dwell times to between zero and 2.5 seconds. You can turn looping on or off by checking the Looping check button. There is also a Looping button located on the Toolbar that enables/disables the animation in the large display pane. Finally, you can turn looping on and increase/decrease forward speed by pressing Page Up/Page Down on your keyboard, and turn looping off with the Left or Right Arrow keys. On the toolbar, you can use the button to start/stop looping. Image Properties (Ctrl + I) \uf0c1 The Image Properties dialog box can be opened here (in the Options menu) or by using the Image Properties iconified button on the D2D Toolbar ( ), or using the Ctrl + I keyboard shortcut. This dialog box provides options that allow you to change the color table; adjust the brightness, contrast, and alpha of either a single image or combined images; fade between combined images; and/or interpolate the displayed data. Set Time \uf0c1 This option allows you to set the CAVE clock, located on the bottom of the screen, to an earlier time for reviewing archived data. Set Background Color \uf0c1 You can now set the background display color on your workstation. You can also set the background display color for a single pane via mouse Button 3 (B3). Switching Perspectives \uf0c1 Switching perspectives in CAVE can be found in the CAVE > Perspective menu. D2D is one of many available CAVE perspectives. By selecting the CAVE > Perspective menu you can switch into the GFE or Localization perspective. Note : The National Centers Perspective (which is available in the Other... submenu) is available on the Linux version of CAVE. And the GFE perspective is not available on the Windows version. CAVE Preferences \uf0c1 Preferences and settings for the CAVE client can be found in the CAVE > Preferences menu. 
Set the Localization Site and server for the workstation; configure mouse operations, change performance levels, font magnification, and text workstation hostname. Load Mode \uf0c1 Within the Display Properties dialog is the Load Mode option, which provides different ways to display data by manipulating previous model runs and inventories of data sets. The selected load mode is shown on the toolbar when the Load Mode menu is closed, and can also be changed using this toolbar option. A description of the Load Mode options follows. Latest : Displays forecast data only from the latest model run, but also backfills at the beginning of the loop with available frames from previous runs to satisfy the requested number of frames. Valid time seq : Displays the most recent data and fills empty frames with previous data. For models, it provides the product from the latest possible run for every available valid time. No Backfill : Displays model data only from the most recent model run time with no backfilling to fill out a loop. Using this Load Mode prevents the mixing of old and new data. Previous run : Displays the previous model run, backfilling with frames from previous runs at the beginning of the loop to satisfy the requested number of frames. Prev valid time seq : Displays the previous model run and fills empty frames with previous model data or analyses. Prognosis loop : Shows a sequence of n-hour forecasts from successive model runs. Analysis loop : Loads a sequence of model analyses but no forecasts. dProg/dt : Selects forecasts from different model runs that all have the same valid times. This load mode is available only when there are no other products loaded in the large display pane. Forced : Puts the latest version of a selected product in all frames without time-matching. Forecast match : Overlays a model product only when its forecast times match those of an initially loaded product. This load mode is available only when another product is already loaded in the large display pane. Inventory : Selecting a product when the load mode is set to Inventory brings up a Dialog Box with the available forecast and inventory times from which you can select the product you want. Inventory loads into the currently displayed frame. Slot : Puts the latest version of a selected product in the currently displayed frame.","title":"D2D Perspective"},{"location":"cave/d2d-perspective/#d2d-perspective","text":"D2D (Display 2-Dimensions) is the default AWIPS CAVE perspective, designed to mimic the look and feel of the legacy AWIPS I system. System menus include CAVE , File , View , Options , and Tools . Data menus include Models , Surface , NCEP/Hydro , Upper Air , Satellite , Local Radar Stations , Radar , MRMS , and Maps . Map projection, image properties, frame control, and a few featured applications ( Warngen , Nsharp , and Browser ) make up the primary D2D toolbar. Note : Depending on which Operating System version of CAVE you are running, there may be other application options ( PGEN , GEMPAK ).","title":"D2D Perspective"},{"location":"cave/d2d-perspective/#resource-stack","text":"At the bottom-right of the map window is the Resource Stack, which displays all loaded resources and map overlays, and allows for interaction with and customization of the resource via a right-click menu . There are three available views of the Resource Stack; the default shows all Product Resources. The other two views are the Simple view, which shows the time, and the Map Resources. 
To switch between views, see the Right-Click Functionality . It's important to understand that Product Resources and Map Resources are handled differently given the time-based nature of Products, compared to the static nature of maps. Selecting the Clear button will remove all Products but not remove any Map Products.","title":"Resource Stack"},{"location":"cave/d2d-perspective/#left-click-resource-name-to-hide","text":"A left click on any resource in the stack will hide the resource and turn the label gray. Clicking the name again makes the resource visible.","title":"Left-Click Resource Name to Hide"},{"location":"cave/d2d-perspective/#right-click-background-to-cycle-resource-views","text":"The default display in the resource stack is the Product Resources. Right Click the mouse on the map background (anywhere but on the stack itself) to switch to a Simple View, which just shows the current displayed time if product data is loaded. Right Click again to show all Map Resources. Right Click again to switch back to Product Resources.","title":"Right-Click Background to Cycle Resource Views"},{"location":"cave/d2d-perspective/#hold-right-click-resource-name-for-menu","text":"Drag the mouse over a loaded resource and hold the right mouse button until a menu appears. The hold-right-click menu allows you to control individual resource Image Properties , Change Colormaps , change resource color, width, density, and magnification, move resources up and down in the stack, as well as configure custom options with other interactive resources. This menu also gives you the option to unload this specific product , as opposed to removing all data products. Simply select the Unload option at the bottom of the resource's hold-right-click menu.","title":"Hold-Right-Click Resource Name for Menu"},{"location":"cave/d2d-perspective/#display-menu","text":"The display menu has many options which can alter the functionality in CAVE.","title":"Display Menu"},{"location":"cave/d2d-perspective/#hold-right-click-background-for-display-menu","text":"Holding down the right mouse button anywhere in the map view will open a right-click menu.","title":"Hold-Right-Click Background for Display Menu"},{"location":"cave/d2d-perspective/#show-map-legends","text":"From the above menu select Show Map Legends and watch the Resource Stack show only map resources which are loaded to the view.","title":"Show Map Legends"},{"location":"cave/d2d-perspective/#sample-loaded-resources","text":"Most data types have a right-click menu option for reading out the pixel value, displayed as multi-line text for multiple resources. This can be toggled on and off by selecting the Sample option in the Display Menu.","title":"Sample Loaded Resources"},{"location":"cave/d2d-perspective/#toggle-2-or-4-panel-layout","text":"Right-click hold in the view and select Two Panel Layout or Four Panel Layout to create duplicates of the current view. Notice the readout is at the same position in both panels. Any mouse movement made on one panel will be made on the other. By default, loading any data will load that data onto both panels. However, there is the option to specify which panel you would like to load data into, which can be useful if you want to have different data in each of the panels. To access this option, simply hold-right-click to pull up the Display menu and choose Load to This Panel as shown below: Now, a yellow L will appear in the lower left hand corner of the panel you selected to load data to. 
When data is loaded from the menus it will only load to the display designated with the L. Switch back to loading in both panels by using the Load to All Panels option in the Display Menu. From this multi-pane display, hold-right-click again and you will see the Single Panel Layout option to switch back to a standard view (defaulting to the left of two, and top-left of four).","title":"Toggle 2 or 4-Panel Layout"},{"location":"cave/d2d-perspective/#unload-data","text":"Select Unload All Products to remove all loaded graphic and image products from the display and start fresh. Select Unload Graphics to remove all but the image products.","title":"Unload Data"},{"location":"cave/d2d-perspective/#product-browser","text":"The Product Browser allows users to browse a complete data inventory in a side window, organized by data type. To open the Product Browser, either select the icon in the toolbar ( ), or go to the menu: CAVE > Data Browsers > Product Browser . Selections for Grid , Lightning , Maps , Radar , Redbook , and Satellite are available. All products loaded with the Product Browser are given default settings. Note : The Linux and Mac versions also have a selection for GFE available.","title":"Product Browser"},{"location":"cave/d2d-perspective/#options-menu","text":"There are several toggle options and options dialogs that are available under the Options menu found at the top of the application.","title":"Options Menu"},{"location":"cave/d2d-perspective/#time-options-ctrl-t","text":"This check button enables/disables the ability to select the time interval between frames of real-time or model data. This feature has the added benefit of allowing you to view extended amounts of data (temporally) but stay within the limits of 64 frames. For example, METAR surface plots, which typically display every hour, can be set to display every three hours via the Select Valid Time and Time Resolution Dialog Box. When the Time Options check button is selected, the next product you choose to display in the Main Display Pane launches either the Select Valid Time and Time Resolution dialog box or the Select Offset and Tolerance dialog box. When you are loading data to an empty display and the Time Options check button is enabled, the Select Valid Time and Time Resolution dialog box opens. Valid Time: In this column of dates/times, you may choose the one that will be the first frame loaded onto the Large Display Pane. The Default option is the most recent data. Time Resolution: This column contains various time increments in which the data can be displayed. Once you make a selection, the Valid Time Column indents the exact times that will be displayed. The Default resolution displays the most recent frames available. With the Time Options check button enabled for a display that already contains data, when you choose the data to be overlaid in the Main Display Pane, the Select Offset and Tolerance dialog box appears, providing the following options: Offset : This column contains various time increments at intervals before, at, or after the time you selected for the first product that is displayed in the Main Display Pane. Tolerance : The options in this column refer to how strict the time matching is. \"None\" means an exact match, while \"Infinite\" will put the closest match in each frame, regardless of how far off it is.","title":"Time Options (Ctrl + T)"},{"location":"cave/d2d-perspective/#image-combination-insert","text":"This check button enables/disables the ability to display two images at once. 
Combined-image displays have been improved by removing the valid time for non-forecast products and removing the date string (time is kept) from the left side of the legend. In particular, this makes All-Tilts radar legends more usable.","title":"Image Combination (Insert)"},{"location":"cave/d2d-perspective/#display-properties","text":"This menu option opens the Display Properties dialog box. Most of the options available in this dialog box are also available on the Toolbar , while the rest are available in the individual resource menus if that resource uses these properties.","title":"Display Properties"},{"location":"cave/d2d-perspective/#loop-properties-ctrl-l","text":"Loop Properties is another dialog box that can be opened from the Options menu or from the Loop Properties iconified button on the D2D Toolbar, or by using the Ctrl + L keyboard shortcut. The dialog allows you to adjust the forward and backward speeds, with 0 = off and 10 = maximum speed. You can set the duration of the first and last frame dwell times to between zero and 2.5 seconds. You can turn looping on or off by checking the Looping check button. There is also a Looping button located on the Toolbar that enables/disables the animation in the large display pane. Finally, you can turn looping on and increase/decrease forward speed by pressing Page Up/Page Down on your keyboard, and turn looping off with the Left or Right Arrow keys. On the toolbar, you can use the button to start/stop looping.","title":"Loop Properties (Ctrl + L)"},{"location":"cave/d2d-perspective/#image-properties-ctrl-i","text":"The Image Properties dialog box can be opened here (in the Options menu) or by using the Image Properties iconified button on the D2D Toolbar ( ), or using using the Ctrl + I keyboard shortcut. This dialog box provides options that allow you to change the color table; adjust the brightness, contrast, and alpha of either a single image or combined images; fade between combined images; and/or interpolate the displayed data.","title":"Image Properties (Ctrl + I)"},{"location":"cave/d2d-perspective/#set-time","text":"This option allows you to set the CAVE clock, located on the bottom of the screen, to an earlier time for reviewing archived data.","title":"Set Time"},{"location":"cave/d2d-perspective/#set-background-color","text":"You can now set the background display color on your workstation. You can also set the background display color for a single pane via mouse Button 3 (B3).","title":"Set Background Color"},{"location":"cave/d2d-perspective/#switching-perspectives","text":"Switching perspectives in CAVE can be found in the CAVE > Perspective menu. D2D is one of many available CAVE perspectives. By selecting the CAVE > Perspective menu you can switch into the GFE , or Localization perspective. Note : The National Centers Perspective (which is available in the Other... submenu) is available on the Linux version of CAVE. And the GFE perspective is not available on the Windows version.","title":"Switching Perspectives"},{"location":"cave/d2d-perspective/#cave-preferences","text":"Preferences and settings for the CAVE client can be found in the CAVE > Preferences menu. 
Set the Localization Site and server for the workstation; configure mouse operations, change performance levels, font magnification, and text workstation hostname.","title":"CAVE Preferences"},{"location":"cave/d2d-perspective/#load-mode","text":"Within the Display Properties dialog is the Load Mode option, which provides different ways to display data by manipulating previous model runs and inventories of data sets. The selected load mode is shown on the toolbar when the Load Mode menu is closed, and can also be changed by using this toolbar option as well. A description of the Load Mode options follow. Latest : Displays forecast data only from the latest model run, but also backfills at the beginning of the loop with available frames from previous runs to satisfy the requested number of frames. Valid time seq : Displays the most recent data and fills empty frames with previous data. For models, it provides the product from the latest possible run for every available valid time. No Backfill : Displays model data only from the most recent model run time with no backfilling to fill out a loop. Using this Load Mode prevents the mixing of old and new data. Previous run : Displays the previous model run, backfilling with frames from previous runs at the beginning of the loop to satisfy the requested number of frames. Prev valid time seq : Displays the previous model run and fills empty frames with previous model data or analyses. Prognosis loop : Shows a sequence of n-hour forecasts from successive model runs. Analysis loop : Loads a sequence of model analyses but no forecasts. dProg/dt : Selects forecasts from different model runs that all have the same valid times. This load mode is available only when there are no other products loaded in the large display pane. Forced : Puts the latest version of a selected product in all frames without time-matching. Forecast match : Overlays a model product only when its forecast times match those of an initially loaded product. This load mode is available only when another product is already loaded in the large display pane. Inventory : Selecting a product when the load mode is set to Inventory brings up a Dialog Box with the available forecast and inventory times from which you can select the product you want. Inventory loads into the currently displayed frame. Slot : Puts the latest version of a selected product in the currently displayed frame.","title":"Load Mode"},{"location":"cave/d2d-pointdata-surface-obs/","text":"Several of the data sets in the Obs menu can be interrogated (sampled) for more detailed information by clicking mouse Button 1 (B1) over a site. These data sets include METAR, Maritime, and Local. The Obs menu is subdivided into sections that contain related products. These sections are described below. METAR \uf0c1 This section contains automatically updating METAR observations, ceiling and visibility plots, wind chill and heat indices, precipitation plots at various time intervals, and quality-checked MSAS observations. The 24hr Chg METAR plot provides the difference between the observed temperature, dewpoint, pressure, and wind from those observed 24 hours earlier. The calculation of the wind difference involves vector subtraction of the \"u\" and \"v\" components. Synoptic \uf0c1 This section contains automatically updating Synoptic observations, and 6 hour and 24 hour precip plots. Note that this section of the menu is not present at most sites. 
Maritime \uf0c1 This section contains buoy and ship report plots, plus SAFESEAS for the Marine WFOs. MAROB displays include Station Plots. The Other Maritime Plots cascading menu contains options to display the Fixed and Moving Sea State plots, MAROB Sea State and Cloud/Vis plots, Maritime Clouds/Visibility plots, as well as the Scatterometer Winds. Sea State plots provide information on the wave period and height and swell period and height. The wave type, whether a standard wave or a wind wave, is denoted at the origin of the plot by a \"+\" or a \"w\", respectively. An \"x\" at the plot origin signifies that no wave type was reported. If reported, the directions of the primary and secondary swells are denoted with arrows labeled \"1\" and \"2\", respectively. The arrows point in the direction the swell is moving. Maritime Clouds/Visibility plots contain a station circle denoting sky coverage and the visibility along with standard symbols for obstructions to visibility. Scatterometer Winds are obtained from the ASCAT instrument on EUMETSAT's MetOp-A polar orbiting satellite. This instrument sends pulses of radiation to the ocean surface and measures the amount of energy, called backscatter, it receives back. When you sample these observations, the time, satellite ID, wind direction, and wind speed are provided. With the polar orbiting scanning, a given region will generally be sampled about every 12 hours. ASCAT Winds (25 km retrieval resolution but interpolated and displayed at 12.5 km resolution) can be launched from either the CAVE Obs menu or from the Satellite menu. You can access the Scatterometer Winds menu options by selecting Surface > Other Maritime Plots > Scatterometer Winds . The ASCAT Scatterometer Ocean Winds product is displayable on CAVE at all scales: N. Hemisphere, North America, CONUS, Regional, State(s), and WFO. Local Storm Reports : Local Storm Report (LSR) plots are generated from spotter reports that were entered into the LSR text database and decoded into the correct point data format. The LSR graphical user interface (GUI) is a stand-alone AWIPS application designed to provide forecasters with an easy and quick way to create, manage, and send the LSR public text product. This text product contains noteworthy weather events for which the forecaster has either received or sought out real-time observations. National Convective Weather Forecast (AWC) \uf0c1 The National Convective Weather Forecast (NCWF) is an automatically generated depiction of current convection and extrapolated significant current convection. It is a supplement to, but does not substitute for, the report and forecast information contained in Convective SIGMETs . The NCWF contains both GRIB and BUFR output. The GRIB output delineates the current convection. The BUFR output includes hazardous convection area polygons, movement arrows, and storm top and speed text information. The NCWF display bundle renders storm tops and movement , previous performance polygons , 1-hour extrapolation polygons , and current convective interest grid (colorbar). Center Weather Advisories (CWA) \uf0c1 The CWA is an aviation weather warning for conditions meeting or approaching national in-flight advisory (AIRMET, SIGMET or SIGMET for convection) criteria. The CWA is primarily used by air crews to anticipate and avoid adverse weather conditions in the en route and terminal environments. It is not a flight planning product because of its short lead time and duration. 
Shown with NEXRAD DHR composite: MOS Products \uf0c1 These plots are derived from the MOS BUFR Bulletins. The previous MOS plots were derived from the MOS Text Bulletins. The plots display forecast data for GFS MOS, GFS-Extended MOS, and NGM MOS. Submenus under each model reveal the element choices. These displays include: Station Model Plots (Wind, T, Td, Sky Cover, Wx) MaxT/MinT (\u00b0F) Ceiling (agl) / Visibility (ft \u00d7 100) (Categorical) Probabilities Submenu (6h/12h PoP, 6h/12h Tstorm, 6h/12h Svr-Tstorm, Conditional precipitation types; %) QPF 12h (Categorical mid-points; inches) QPF 6h (Categorical mid-points; inches) Snowfall (6h/12h/24h, Categorical; inches) Lightning \uf0c1 This menu item provides three options for displaying lightning flash plots over specified 1 minute, 5 minute, 15 minute and 1 hour intervals. USPLN (United States Precision Lightning Network) : WSI Corporation USPLN lightning data has been made available exclusively to universities for education and research use. Unidata serves USPLN lightning stroke data from the LIGHTNING LDM data feed. Registration is required to request this data, and the free feed is available on an annually renewed basis. USPLN data is not available to the public. NLDN (National Lightning Detection Network) : The NLDN option plots cloud-to-ground (CG) lightning flashes for specified time intervals across the continental United States. NLDN lightning data can be displayed as a grid image displaying the cloud-to-ground density values for a selected resolution (1km, 3km, 5km, 8km, 20km, and 40km). GLD (Global Lightning Dataset) : The GLD option plots cloud-to-ground (CG) lightning flashes for specified time intervals on a global-scale. GLD lightning data can also be displayed as a grid image displaying the cloud-to-ground density values for a selected resolution (1km, 3km, 5km, 8km, 20km, and 40km). ENI Total Lightning : In addition to displaying CG lightning flashes, the Total Lightning option also displays Cloud Flash (CF) lightning and Pulses. CF lightning are lightning flashes which do not strike the ground such as in-cloud, cloud-to-cloud, and cloud-to-air lightning. Lightning pulses are electromagnetic pulses that radiate outward from the lightning channel. ENI total lightning data can be displayed as a grid image displaying the cloud-to-ground, cloud flash, and lightning pulse density values for a selected resolution (1km, 3km, 5km, 8km, 20km, and 40km).","title":"D2d pointdata surface obs"},{"location":"cave/d2d-pointdata-surface-obs/#metar","text":"This section contains automatically updating METAR observations, ceiling and visibility plots, wind chill and heat indices, precipitation plots at various time intervals, and quality-checked MSAS observations. The 24hr Chg METAR plot provides the difference between the observed temperature, dewpoint, pressure, and wind from those observed 24 hours earlier. The calculation of the wind difference involves vector subtraction of the \"u\" and \"v\" components.","title":"METAR"},{"location":"cave/d2d-pointdata-surface-obs/#synoptic","text":"This section contains automatically updating Synoptic observations, and 6 hour and 24 hour precip plots. Note that this section of the menu is not present at most sites.","title":"Synoptic"},{"location":"cave/d2d-pointdata-surface-obs/#maritime","text":"This section contains buoy and ship report plots, plus SAFESEAS for the Marine WFOs. 
MAROB displays include Station Plots The Other Maritime Plots cascading menu contains options to display the Fixed and Moving Sea State plots, MAROB Sea State and Cloud/Vis plots, Maritime Clouds/Visibility plots, as well as the Scatterometer Winds. Sea State plots provide information on the wave period and height and swell period and height. The wave type, whether a standard wave or a wind wave, is denoted at the origin of the plot by a \"+\" or a \"w\", respectively. An \"x\" at the plot origin signifies that no wave type was reported. If reported, the directions of the primary and secondary swells are denoted with arrows labeled \"1\" and \"2\", respectively. The arrows point in the direction the swell is moving. Maritime Clouds/Visibility plots contain a station circle denoting sky coverage and the visibility along with standard symbols for obstructions to visibility. Scatterometer Winds are obtained from the ASCAT instrument on EUMETSAT's MetOp-A polar orbiting satellite. This instrument sends pulses of radiation to the ocean surface and measures the amount of energy, called backscatter, it receives back. When you sample these observations, the time, satellite ID, wind direction, and wind speed are provided. With the polar orbiting scanning, a given region will generally be sampled about every 12 hours. ASCAT Winds (25 km retrieval resolution but interpolated and displayed at 12.5 km resolution) can be launched from either the CAVE Obs menu or from the Satellite menu You can access the Scatterometer Winds menu options by selecting Surface > Other Maritime Plots > Scatterometer Winds . The ASCAT Scatterometer Ocean Winds product is displayable on CAVE at all scales: N. Hemisphere, North America, CONUS, Regional, State(s), and WFO. Local Storm Reports : Local Storm Report (LSR) plots are generated from spotter reports that were entered into the LSR text database and decoded into the correct point data format. The LSR graphical user interface (GUI) is a stand-alone AWIPS application designed to provide forecasters with an easy and quick way to create, manage, and send the LSR public text product. This text product contains noteworthy weather events for which the forecaster has either received or sought out real-time observations.","title":"Maritime"},{"location":"cave/d2d-pointdata-surface-obs/#national-convective-weather-forecast-awc","text":"The National Convective Weather Forecast (NCWF) is an automatically generated depiction of current convection and extrapolated significant current convection. It is a supplement to, but does not substitute for, the report and forecast information contained in Convective SIGMETs . The NCWF contains both GRIB and BUFR output. The GRIB output delineates the current convection. The BUFR output includes hazardous convection area polygons, movement arrows, and storm top and speed text information. The NCWF display bunlde renders storm tops and movement , previous performance polygons , 1-hour extrapolation polygons , and current convective interest grid (colorbar).","title":"National Convective Weather Forecast (AWC)"},{"location":"cave/d2d-pointdata-surface-obs/#center-weather-advisories-cwa","text":"The CWA is an aviation weather warning for conditions meeting or approaching national in-flight advisory (AIRMET, SIGMET or SIGMET for convection) criteria. The CWA is primarily used by air crews to anticipate and avoid adverse weather conditions in the en route and terminal environments. 
It is not a flight planning product because of its short lead time and duration. Shown with NEXRAD DHR composite:","title":"Center Weather Advisories (CWA)"},{"location":"cave/d2d-pointdata-surface-obs/#mos-products","text":"These plots are derived from the MOS BUFR Bulletins. The previous MOS plots were derived from the MOS Text Bulletins. The plots display forecast data for GFS MOS, GFS-Extended MOS, and NGM MOS. Submenus under each model reveal the element choices. These displays include: Station Model Plots (Wind, T, Td, Sky Cover, Wx) MaxT/MinT (\u00b0F) Ceiling (agl) / Visibility (ft \u00d7 100) (Categorical) Probabilities Submenu (6h/12h PoP, 6h/12h Tstorm, 6h/12h Svr-Tstorm, Conditional precipitation types; %) QPF 12h (Categorical mid-points; inches) QPF 6h (Categorical mid-points; inches) Snowfall (6h/12h/24h, Categorical; inches)","title":"MOS Products"},{"location":"cave/d2d-pointdata-surface-obs/#lightning","text":"This menu item provides options for displaying lightning flash plots over specified 1 minute, 5 minute, 15 minute and 1 hour intervals. USPLN (United States Precision Lightning Network) : WSI Corporation USPLN lightning data has been made available exclusively to universities for education and research use. Unidata serves USPLN lightning stroke data from the LIGHTNING LDM data feed. Registration is required to request this data, and the free feed is available on an annually renewed basis. USPLN data is not available to the public. NLDN (National Lightning Detection Network) : The NLDN option plots cloud-to-ground (CG) lightning flashes for specified time intervals across the continental United States. NLDN lightning data can be displayed as a grid image displaying the cloud-to-ground density values for a selected resolution (1km, 3km, 5km, 8km, 20km, and 40km). GLD (Global Lightning Dataset) : The GLD option plots cloud-to-ground (CG) lightning flashes for specified time intervals on a global scale. GLD lightning data can also be displayed as a grid image displaying the cloud-to-ground density values for a selected resolution (1km, 3km, 5km, 8km, 20km, and 40km). ENI Total Lightning : In addition to displaying CG lightning flashes, the Total Lightning option also displays Cloud Flash (CF) lightning and Pulses. CF lightning refers to lightning flashes that do not strike the ground, such as in-cloud, cloud-to-cloud, and cloud-to-air lightning. Lightning pulses are electromagnetic pulses that radiate outward from the lightning channel. ENI total lightning data can be displayed as a grid image displaying the cloud-to-ground, cloud flash, and lightning pulse density values for a selected resolution (1km, 3km, 5km, 8km, 20km, and 40km).","title":"Lightning"},{"location":"cave/d2d-radar-tools/","text":"Radar Tools \uf0c1 The radar tools are a subset of the tools available in CAVE. These programs are accessible through the Tools dropdown menu, and in individual site radar menus. Estimated Actual Velocity (EAV) \uf0c1 A velocity (V) display from the radar shows only the radial component of the wind, so the indicated speed depends on the direction of the wind and the azimuth (direction) from the radar. Consider, for example, a north wind. Straight north of the radar, the full speed of the wind will be seen on the V product. As one moves around to the east of the radar, the radial component gets smaller, eventually reaching zero straight east of the radar. 
If the wind direction is known, then the actual wind speed can be computed by dividing the observed radial speed by the cosine of the angle between the radar radial and the actual direction. The EAV tool allows you to provide that angle and use the sampling function of the display to show the actual wind speed. Radar Display Controls \uf0c1 The Radar Display Controls dialog box is derived from the Radar Tools submenu and provides options that control the appearance of the Storm Track Information (STI), the Hail Index (HI), the Tornado Vortex Signature (TVS), the Digital Mesocyclone Display (DMD) products, the Microburst Alert (MBA) products, the Storm Relative Motion (SRM), and the SAILS products. The Radar Display Controls dialog box options are described below. Note : Our version of CAVE may not have all the products that these options are applicable to. The Radar Display Controls dialog box is divided into eight sections: STI , HI , TVS , DMD/MD/TVS , DMD , MBA , SRM , and SAILS . Each section has the following options: STI (Storm Track Information) \uf0c1 This section has options to adjust the appearance of the STI graphic product. Number of storms to show : This slider bar lets you choose the maximum number of storms (0 to 100) you wish to display on the STI product. The default value is 20 storms. Type of track to show : This options menu allows you to choose the type of storm track that you want displayed. HI (Hail Index) \uf0c1 This portion of the Radar Display Controls dialog box contains options that alter the appearance of the HI radar graphic product. You can set the low and high algorithm thresholds of the Probability of Hail (POH) and the Probability of Severe Hail (POSH). Storms that meet the low POH threshold are indicated by small open triangles, while small solid triangles mark those that meet the high POH threshold. Similarly, large open triangles or solid triangles are plotted for the POSH low and high thresholds, respectively. Low hail probability (POH) : The storms that meet or exceed the threshold are indicated by small open triangles. The default setting is 30. Low severe hail probability (POSH) : The storms that meet or exceed the threshold are indicated by large open triangles. The default setting is 30. High hail probability : The storms that meet or exceed the threshold are indicated by small solid triangles. The default setting is 50. High severe hail probability : The storms that meet or exceed the threshold are indicated by large solid triangles. The default setting is 50. TVS (Tornado Vortex Signature) \uf0c1 There is one option in this section of the Radar Display Controls dialog box. Show elevated TVS : This toggle button lets you control the appearance of the elevated TVS radar graphic product. DMD, MD, TVS \uf0c1 There is one option in this section of the Radar Display Controls dialog box. Show extrapolated features : With this option, you can choose whether to show the time-extrapolated features using DMD, MD, or TVS. DMD (Digital Mesocyclone Display) \uf0c1 Minimum feature strength : A mesocyclone clutter filter which specifies the minimum 3D strength rank used to display a mesocyclone (default is 5). Show overlapping Mesos : Toggles whether to show overlapping mesocyclones. Type of track to show : This dropdown has options available for whether to display past and/or forecast tracks. MBA (Microburst Alert) \uf0c1 Show Wind Shear : This option allows you to choose whether to display wind shear associated with microburst alerts. 
SRM (Storm Relative Motion) \uf0c1 The first three options in the SRM section allow you to choose where you want to derive the storm motion from. Storm Motion from WarnGen Track : Selecting this option will display the storm motion from a WarnGen Track. Average Storm Motion from STI : Selecting this option will display the average storm motion from the storm track information (STI). Custom Storm Motion : Selecting this option allows you to specify a custom storm motion with the selections below. Direction : This slider allows you to choose the direction (in degrees??) of the storm motion. Speed : This slider allows you to specify the speed (in mph??) of the storm motion. SAILS (Supplemental Adaptive Intra-Volume Low Level Scan) \uf0c1 Enable SAILS Frame Coordinator : Enabled (default) : keyboard shortcuts change where tilting up from 0.5 degree SAILS tilt will step to the next higher tilt (similar to GR2 Analyst) and Ctrl right arrow will step to the most recent tilt available for any elevation angle. Disabled : keyboard shortcuts change where tilting up from 0.5 degree SAILS tilt will not go anywhere (old confusing behavior) and Ctrl right arrow will step to the most recent time of the current tilt. VR - Shear \uf0c1 This tool is used in conjunction with Doppler velocity data to calculate the velocity difference (or \"shear\") of the data directly under the end points. As with the Baselines, this feature comes up editable and the end points can be dragged to specific gates of velocity data. When in place, the speed difference (kts), distance between end points (nautical miles), shear (s-1), and distance from radar (Nmi) are automatically plotted next to the end points and in the upper left corner of the Main Display Pane. A positive shear value indicates cyclonic shear, while a negative value indicates anticyclonic shear. If either end point is not directly over velocity data, the phrase \"no data\" is reported for the shear value. This tool is also useful in determining gate-to-gate shear. Simply place the two end points directly over adjacent gates of velocity data. \"Snapping\" VR Shear : If you are zoomed in over an area when you load VR - Shear, and the VR - Shear Baseline does not appear, click the right mouse button to \"snap\" the Baseline to where the mouse cursor is located. VR - Shear in 4 Panel : You can use the VR - Shear Tool when the large display is in 4 panel mode. The VR - Shear overlay is loaded in different colors for each panel. There are actually four copies of the program running, and each behaves independently. This means that you can get accurate readings in any one of the four panels \u2014 one VR - Shear panel is editable at a time. To activate, click the center mouse button on the VR - Shear legend in the desired panel and position the query line to the echoes of interest.","title":"Radar Tools"},{"location":"cave/d2d-radar-tools/#radar-tools","text":"The radar tools are a subset of the tools available in CAVE. These programs are accessible through the Tools dropdown menu, and in individual site radar menus.","title":"Radar Tools"},{"location":"cave/d2d-radar-tools/#estimated-actual-velocity-eav","text":"A velocity (V) display from the radar shows only the radial component of the wind, so the indicated speed depends on the direction of the wind and the azimuth (direction) from the radar. Consider, for example, a north wind. Straight north of the radar, the full speed of the wind will be seen on the V product. 
As one moves around to the east of the radar, the radial component gets smaller, eventually reaching zero straight east of the radar. If the wind direction is known, then the actual wind speed can be computed by dividing the observed radial speed by the cosine of the angle between the radar radial and the actual direction. The EAV tool allows you to provide that angle and use the sampling function of the display to show the actual wind speed.","title":"Estimated Actual Velocity (EAV)"},{"location":"cave/d2d-radar-tools/#radar-display-controls","text":"The Radar Display Controls dialog box is derived from the Radar Tools submenu and provides options that control the appearance of the Storm Track Information (STI), the Hail Index (HI), the Tornado Vortex Signature (TVS), the Digital Mesocyclone Display (DMD) products, the Microburst Alert (MBA) products, the Storm Relative Motion (SRM), and the SAILS products. The Radar Display Controls dialog box options are described below. Note : Our version of CAVE may not have all the products that these options are applicable to. The Radar Display Controls dialog box is divided into eight sections: STI , HI , TVS , DMD/MD/TVS , DMD , MBA , SRM , and SAILS . Each section has the following options:","title":"Radar Display Controls"},{"location":"cave/d2d-radar-tools/#sti-storm-track-information","text":"This section has options to adjust the appearance of the STI graphic product. Number of storms to show : This slider bar lets you choose the maximum number of storms (0 to 100) you wish to display on the STI product. The default value is 20 storms. Type of track to show : This options menu allows you to choose the type of storm track that you want displayed.","title":"STI (Storm Track Information)"},{"location":"cave/d2d-radar-tools/#hi-hail-index","text":"This portion of the Radar Display Controls dialog box contains options that alter the appearance of the HI radar graphic product. You can set the low and high algorithm thresholds of the Probability of Hail (POH) and the Probability of Severe Hail (POSH). Storms that meet the low POH threshold are indicated by small open triangles, while small solid triangles mark those that meet the high POH threshold. Similarly, large open triangles or solid triangles are plotted for the POSH low and high thresholds, respectively. Low hail probability (POH) : The storms that meet or exceed the threshold are indicated by small open triangles. The default setting is 30. Low severe hail probability (POSH) : The storms that meet or exceed the threshold are indicated by large open triangles. The default setting is 30. High hail probability : The storms that meet or exceed the threshold are indicated by small solid triangles. The default setting is 50. High severe hail probability : The storms that meet or exceed the threshold are indicated by small solid triangles. The default setting is 50.","title":"HI (Hail Index)"},{"location":"cave/d2d-radar-tools/#tvs-tornado-vortex-signature","text":"There is one option in this section of the Radar Display Controls dialog box. Show elevated TVS : This toggle button lets you control the appearance of the elevated TVS radar graphic product.","title":"TVS (Tornado Vortex Signature)"},{"location":"cave/d2d-radar-tools/#dmd-md-tvs","text":"There is one option in this section of the Radar Display Controls dialog box. 
Show extrapolated features : With this option, you can choose whether to show the time-extrapolated features using DMD, MD, or TVS.","title":"DMD, MD, TVS"},{"location":"cave/d2d-radar-tools/#dmd-digital-mesocyclone-display","text":"Minimum feature strength : A mesocyclone clutter filter which specifies the minimum 3D strength rank used to display a mesocyclone (default is 5). Show overlapping Mesos : Toggles whether to show overlapping mesocyclones. Type of track to show : This dropdown has options available for whether to display past and/or forecast tracks.","title":"DMD (Digital Mesocyclone Display)"},{"location":"cave/d2d-radar-tools/#mba-microburst-alert","text":"Show Wind Shear : This option allows you to choose whether to display wind shear associated with microburst alerts.","title":"MBA (Microburst Alert)"},{"location":"cave/d2d-radar-tools/#srm-storm-relative-motion","text":"The first three options in the SRM section allow you to choose where you want to derive the storm motion from. Storm Motion from WarnGen Track : Selecting this option will display the storm motion from a WarnGen Track. Average Storm Motion from STI : Selecting this option will display the average storm motion from the storm track information (STI). Custom Storm Motion : Selecting this option allows you to specify a custom storm motion with the selections below. Direction : This slider allows you to choose the direction (in degrees??) of the storm motion. Speed : This slider allows you to specify the speed (in mph??) of the storm motion.","title":"SRM (Storm Relative Motion)"},{"location":"cave/d2d-radar-tools/#sails-supplemental-adaptive-intra-volume-low-level-scan","text":"Enable SAILS Frame Coordinator : Enabled (default) : keyboard shortcuts change where tilting up from 0.5 degree SAILS tilt will step to the next higher tilt (similar to GR2 Analyst) and Ctrl right arrow will step to the most recent tilt available for any elevation angle. Disabled : keyboard shortcuts change where tilting up from 0.5 degree SAILS tilt will not go anywhere (old confusing behavior) and Ctrl right arrow will step to the most recent time of the current tilt.","title":"SAILS (Supplemental Adaptive Intra-Volume Low Level Scan)"},{"location":"cave/d2d-radar-tools/#vr-shear","text":"This tool is used in conjunction with Doppler velocity data to calculate the velocity difference (or \"shear\") of the data directly under the end points. As with the Baselines, this feature comes up editable and the end points can be dragged to specific gates of velocity data. When in place, the speed difference (kts), distance between end points (nautical miles), shear (s-1), and distance from radar (Nmi) are automatically plotted next to the end points and in the upper left corner of the Main Display Pane. A positive shear value indicates cyclonic shear, while a negative value indicates anticyclonic shear. If either end point is not directly over velocity data, the phrase \"no data\" is reported for the shear value. This tool is also useful in determining gate-to-gate shear. Simply place the two end points directly over adjacent gates of velocity data. \"Snapping\" VR Shear : If you are zoomed in over an area when you load VR - Shear, and the VR - Shear Baseline does not appear, click the right mouse button to \"snap\" the Baseline to where the mouse cursor is located. VR - Shear in 4 Panel : You can use the VR - Shear Tool when the large display is in 4 panel mode. The VR - Shear overlay is loaded in different colors for each panel. 
There are actually four copies of the program running, and each behaves independently. This means that you can get accurate readings in any one of the four panels \u2014 one VR - Shear panel is editable at a time. To activate, click the center mouse button on the VR - Shear legend in the desired panel and position the query line to the echoes of interest.","title":"VR - Shear"},{"location":"cave/d2d-radar/","text":"NEXRAD Radar Display \uf0c1 The Unidata D2D Perspective features a selectable NEXRAD station display over a loop of the FNEXRAD Digital Hybrid Reflectivity product. Selecting any station will open a two-panel reflectivity and velocity view for the selected station. NEXRAD & TDWR Station Menus \uf0c1 Individual NEXRAD station menus are accessible in Radar > NEXRAD Stations and are grouped alphabetically for a condensed submenu structure. With only the NEXRAD3 feedtype (NEXRAD2 being disabled), notice that only some of the menu items will have available data. Best Res Z+SRM8 / Z+V \uf0c1 The radar combination products Z+SRM and Z+V are precombined formats of the reflectivity and storm relative motion or velocity, displayed together via a single menu selection. SRM products include the storm motion vector information, which is plotted in the upper left corner of the Main Display Pane. 4-panel Z+SRM, ZDR+V, KDP+HC, CC+SW \uf0c1 4-panel Z, ZDR, HC+KDP, CC \uf0c1 This section enables you to load multiple base and dual-pol products, which are then simultaneously displayed. The label of this section of the menu describes the format for loading the products: Z+SRM in the upper left quadrant, ZDR+V in the upper right quadrant, KDP+HC in the lower left quadrant, and CC+SW in the lower right quadrant. Primary dual-pol base data analysis is best accomplished using the All Tilts base data option (4 panel all tilts with 8 products loaded), though you may use the single tilts (e.g., 0.5 base data) for longer time duration loops. To load 4 panel displays containing multiple elevation angles of the same product, you would select the four panel option and then select the desired set of 4 panels from the four panel submenu. All Tilts allows you to step or animate in either space or time. Selecting one of the All Tilts buttons will load all the tilts available from the latest volume scan. It will continue to load tilts from previous volume scans until it has loaded as many frames as indicated on the frame count menu. Auto updates will add higher tilts from the latest volume scan, replacing a tilt from the oldest volume. After loading an All Tilts display, Shift + LEFT ARROW and Shift + RIGHT ARROW and looping will take you through the frames in the order in which the system loaded them (without regard to volume scan or tilt). The UP ARROW and DOWN ARROW will step the display up or down in a volume scan allowing the tilts to change for a fixed time. The RIGHT ARROW and LEFT ARROW will step the display forward or backward through time at a fixed tilt. Once you have set the mode of motion (vertical or time), the Page Up/Page Down keys will start and adjust loop speed. To switch from vertical to time mode or from time to vertical mode, press the desired arrow key. If you hit the up or down arrow key in a standard (not All-Tilts) display, looping and stepping are disabled until you hit either the left or right arrow key or one of the stepping buttons on the menu. 
Once an arrow key (Left, Right, Up, Down) has been pressed, the stepping/animation controls on the main window toolbar and the Page Up/Page Down keys will function in that same mode. For example, assume the UP ARROW or DOWN ARROW key is pressed; the menu controls will now operate through the tilts at a fixed time, e.g., you can go to the lowest tilt by selecting the First Frame iconified button. Best Res Base Products \uf0c1 This section is divided into two parts. The upper part lists individual products: four base products and three dual-pol products (ZDR, CC, and KDP). The lower part includes submenus for accessing multiple products and applications. The following describes the submenus grouped in the lower part of the Best Res Base Products section. Precip : In addition to the QPE dual-pol products, this submenu includes the legacy precip products, which include Storm Total, One Hour, Three Hour, and User Selectable precipitation products. A suite of snowfall products is also available on the Precip submenu. All are available for request (OTR, RMR), and the first four can be added to an RPS (Routine Product Set) list. All of these products are available on any scale. Derived Products : The Derived Products submenu includes Layer Reflectivity, Cross Section, and Other products displayed on any scale. Derived products include precipitation, storm (mesocyclone, hail, tornado), and wind derivations. Algorithm Overlays : The Algorithm Overlays submenu includes legacy algorithm overlays and the ML dual-pol overlay. four panel : The four panel submenu includes menu entries for Z+V, Z+SRM 8- and 4-bit, and some other combinations that are presented in 4-panel mode, with a different elevation angle or product in each panel. Data Quality : The Data Quality products, accessible by a pull-right submenu, include Clutter Filter Control and reflectivity and velocity clutter probability products. 4-bit/Legacy Prods : The 4-bit/Legacy Prods submenu uses generic selectors that load 8-bit (256 level) data, with legacy 4-bit (16 level) and 3-bit (8 level) data filling in when no 8-bit data is available. Radar Applications : The Radar Applications submenu provides access to all the radar applications and radar tools. MRMS \uf0c1 FNEXRAD Composites \uf0c1 DHR \uf0c1 DLV \uf0c1 EET \uf0c1 HHC \uf0c1 DAA \uf0c1 DTA \uf0c1 Mosaic Radar Plots \uf0c1 Mosaics available via this menu use data from up to nine nearby radars. Additional optional mosaics on cascading menus provide a limited list of radar products from a predefined set of WSR-88D radars within a given region. Your System Manager or site Administrator can set up such mosaics by: /awips2/edex/data/utility/common_static/site//radar/radarInUse.txt . A mosaicInfo.txt table will only work while logged on to an AWIPS workstation. N0Q \uf0c1 DSP \uf0c1 DTA \uf0c1 DAA \uf0c1 Radar Applications \uf0c1 Estimated Actual Velocity (EAV) \uf0c1 A velocity (V) display from the radar shows only the radial component of the wind, so the indicated speed depends on the direction of the wind and the azimuth (direction) from the radar. Consider, for example, a north wind. Straight north of the radar, the full speed of the wind will be seen on the V product. As one moves around to the east of the radar, the radial component gets smaller, eventually reaching zero straight east of the radar. If the wind direction is known, then the actual wind speed can be computed by dividing the observed radial speed by the cosine of the angle between the radar radial and the actual direction. 
The EAV tool allows you to provide that angle and use the sampling function of the display to show the actual wind speed. Four-dimensional Stormcell Investigator (FSI) \uf0c1 The Four-dimensional Stormcell Investigator (FSI) was developed by the National Severe Storms Laboratory for its Warning Decision Support System Integrated Information. This technology allows users to create and manipulate dynamic cross-sections (both vertical and at constant altitude), such that one can \u201cslice and dice\u201d storms and view these data in three dimensions and across time. V-R Shear \uf0c1 This tool is used in conjunction with Doppler velocity data to calculate the velocity difference (or \"shear\") of the data directly under the end points. As with the Baselines, this feature comes up editable and the end points can be dragged to specific gates of velocity data. When in place, the speed difference (kts), distance between end points (nautical miles), shear (s-1), and distance from radar (Nmi) are automatically plotted next to the end points and in the upper left corner of the Main Display Pane. A positive shear value indicates cyclonic shear, while a negative value indicates anticyclonic shear. If either end point is not directly over velocity data, the phrase \"no data\" is reported for the shear value. This tool is also useful in determining gate-to-gate shear. Simply place the two end points directly over adjacent gates of velocity data. \"Snapping\" VR Shear : If you are zoomed in over an area when you load VR - Shear, and the VR - Shear Baseline does not appear, click B3 to \"snap\" the Baseline to where the mouse cursor is located. VR - Shear in 4 Panel : You can use the VR - Shear Tool when the large display is in 4 panel mode. The VR - Shear overlay is loaded in different colors for each panel. There are actually four copies of the program running, and each behaves independently. This means that you can get accurate readings in any one of the four panels \u2014 one VR - Shear panel is editable at a time. To activate, click B2 on the VR - Shear legend in the desired panel and position the query line to the echoes of interest.","title":"NEXRAD Radar Display"},{"location":"cave/d2d-radar/#nexrad-radar-display","text":"The Unidata D2D Perspective features a selectable NEXRAD station display over a loop of the FNEXRAD Digital Hybrid Reflectivity product. Selecting any station will open a two-panel reflectivity and velocity view for the selected station.","title":"NEXRAD Radar Display"},{"location":"cave/d2d-radar/#nexrad-tdwr-station-menus","text":"Individual NEXRAD station menus are accessible in Radar > NEXRAD Stations and are grouped alphabetically for a condensed submenu structure. With only the NEXRAD3 feedtype (NEXRAD2 being disabled), notice that only some of the menu items will have available data.","title":"NEXRAD & TDWR Station Menus"},{"location":"cave/d2d-radar/#best-res-zsrm8-zv","text":"The radar combination products Z+SRM and Z+V are precombined formats of the reflectivity and storm relative motion or velocity, displayed together via a single menu selection. 
SRM products include the storm motion vector information, which is plotted in the upper left corner of the Main Display Pane.","title":"Best Res Z+SRM8 / Z+V"},{"location":"cave/d2d-radar/#4-panel-zsrm-zdrv-kdphc-ccsw","text":"","title":"4-panel Z+SRM, ZDR+V, KDP+HC, CC+SW"},{"location":"cave/d2d-radar/#4-panel-z-zdr-hckdp-cc","text":"This section enables you to load multiple base and dual-pol products, which are then simultaneously displayed. The label of this section of the menu describes the format for loading the products: Z+SRM in the upper left quadrant, ZDR+V in the upper right quadrant, KDP+HC in the lower left quadrant, and CC+SW in the lower right quadrant. Primary dual-pol base data analysis is best accomplished using the All Tilts base data option (4 panel all tilts with 8 products loaded), though you may use the single tilts (e.g., 0.5 base data) for longer time duration loops. To load 4 panel displays containing multiple elevation angles of the same product, you would select the four panel option and then select the desired set of 4 panels from the four panel submenu. All Tilts allows you to step or animate in either space or time. Selecting one of the All Tilts buttons will load all the tilts available from the latest volume scan. It will continue to load tilts from previous volume scans until it has loaded as many frames as indicated on the frame count menu. Auto updates will add higher tilts from the latest volume scan, replacing a tilt from the oldest volume. After loading an All Tilts display, Shift + LEFT ARROW and Shift + RIGHT ARROW and looping will take you through the frames in the order in which the system loaded them (without regard to volume scan or tilt). The UP ARROW and DOWN ARROW will step the display up or down in a volume scan allowing the tilts to change for a fixed time. The RIGHT ARROW and LEFT ARROW will step the display forward or backward through time at a fixed tilt. Once you have set the mode of motion (vertical or time), the Page Up/Page Down keys will start and adjust loop speed. To switch from vertical to time mode or from time to vertical mode, press the desired arrow key. If you hit the up or down arrow key in a standard (not All-Tilts) display, looping and stepping are disabled until you hit either the left or right arrow key or one of the stepping buttons on the menu. Once an arrow key (Left, Right, Up, Down) has been pressed, the stepping/animation controls on the main window toolbar and the Page Up/Page Down keys will function in that same mode. For example, assume the UP ARROW or DOWN ARROW key is pressed; the menu controls will now operate through the tilts at a fixed time, e.g., you can go to the lowest tilt by selecting the First Frame iconified button.","title":"4-panel Z, ZDR, HC+KDP, CC"},{"location":"cave/d2d-radar/#best-res-base-products","text":"This section is divided into two parts. The upper part lists individual products: four base products and three dual-pol products (ZDR, CC, and KDP). The lower part includes submenus for accessing multiple products and applications. The following describes the submenus grouped in the lower part of the Best Res Base Products section. Precip : In addition to the QPE dual-pol products, this submenu includes the legacy precip products, which include Storm Total, One Hour, Three Hour, and User Selectable precipitation products. A suite of snowfall products is also available on the Precip submenu. 
All are available for request (OTR, RMR), and the first four can be added to an RPS (Routine Product Set) list. All of these products are available on any scale. Derived Products : The Derived Products submenu includes Layer Reflectivity, Cross Section, and Other products displayed on any scale. Derived products include precipitation, storm (mesocyclone, hail, tornado), and wind derivations. Algorithm Overlays : The Algorithm Overlays submenu includes legacy algorithm overlays and the ML dual-pol overlay. four panel : The four panel submenu includes menu entries for Z+V, Z+SRM 8- and 4-bit, and some other combinations that are presented in 4-panel mode, with a different elevation angle or product in each panel. Data Quality : The Data Quality products, accessible by a pull-right submenu, include Clutter Filter Control and reflectivity and velocity clutter probability products. 4-bit/Legacy Prods : The 4-bit/Legacy Prods submenu uses generic selectors that load 8-bit (256 level) data, with legacy 4-bit (16 level) and 3-bit (8 level) data filling in when no 8-bit data is available. Radar Applications : The Radar Applications submenu provides access to all the radar applications and radar tools.","title":"Best Res Base Products"},{"location":"cave/d2d-radar/#mrms","text":"","title":"MRMS"},{"location":"cave/d2d-radar/#fnexrad-composites","text":"","title":"FNEXRAD Composites"},{"location":"cave/d2d-radar/#dhr","text":"","title":"DHR"},{"location":"cave/d2d-radar/#dlv","text":"","title":"DLV"},{"location":"cave/d2d-radar/#eet","text":"","title":"EET"},{"location":"cave/d2d-radar/#hhc","text":"","title":"HHC"},{"location":"cave/d2d-radar/#daa","text":"","title":"DAA"},{"location":"cave/d2d-radar/#dta","text":"","title":"DTA"},{"location":"cave/d2d-radar/#mosaic-radar-plots","text":"Mosaics available via this menu use data from up to nine nearby radars. Additional optional mosaics on cascading menus provide a limited list of radar products from a predefined set of WSR-88D radars within a given region. Your System Manager or site Administrator can set up such mosaics by editing /awips2/edex/data/utility/common_static/site//radar/radarInUse.txt . A mosaicInfo.txt table will only work while logged on to an AWIPS workstation.","title":"Mosaic Radar Plots"},{"location":"cave/d2d-radar/#n0q","text":"","title":"N0Q"},{"location":"cave/d2d-radar/#dsp","text":"","title":"DSP"},{"location":"cave/d2d-radar/#dta_1","text":"","title":"DTA"},{"location":"cave/d2d-radar/#daa_1","text":"","title":"DAA"},{"location":"cave/d2d-radar/#radar-applications","text":"","title":"Radar Applications"},{"location":"cave/d2d-radar/#estimated-actual-velocity-eav","text":"A velocity (V) display from the radar shows only the radial component of the wind, so the indicated speed depends on the direction of the wind and the azimuth (direction) from the radar. Consider, for example, a north wind. Straight north of the radar, the full speed of the wind will be seen on the V product. As one moves around to the east of the radar, the radial component gets smaller, eventually reaching zero straight east of the radar. If the wind direction is known, then the actual wind speed can be computed by dividing the observed radial speed by the cosine of the angle between the radar radial and the actual direction. 
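(As a quick worked example of that relationship, using hypothetical numbers rather than the EAV tool's internal code:)

```python
import math

# Hypothetical example: a 40 kt north wind sampled along a radial that is
# 30 degrees away from the true wind direction.
angle_deg = 30.0
observed_radial_kts = 40.0 * math.cos(math.radians(angle_deg))  # what the V product shows (~34.6 kt)

# Recover the actual wind speed from the observed radial component
actual_speed_kts = observed_radial_kts / math.cos(math.radians(angle_deg))
print(f"actual wind speed = {actual_speed_kts:.1f} kt")  # 40.0 kt
```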
The EAV tool allows you to provide that angle and use the sampling function of the display to show the actual wind speed.","title":"Estimated Actual Velocity (EAV)"},{"location":"cave/d2d-radar/#four-dimensional-stormcell-investigator-fsi","text":"The Four-dimensional Stormcell Investigator (FSI) was developed by the National Severe Storms Laboratory for its Warning Decision Support System Integrated Information. This technology allows users to create and manipulate dynamic cross-sections (both vertical and at constant altitude), such that one can \u201cslice and dice\u201d storms and view these data in three-dimensions and across time.","title":"Four-dimensional Stormcell Investigator (FSI)"},{"location":"cave/d2d-radar/#v-r-shear","text":"This tool is used in conjunction with Doppler velocity data to calculate the velocity difference (or \"shear\") of the data directly under the end points. As with the Baselines, this feature comes up editable and the end points can be dragged to specific gates of velocity data. When in place, the speed difference (kts), distance between end points (nautical miles), shear (s-1), and distance from radar (Nmi) are automatically plotted next to the end points and in the upper left corner of the Main Display Pane. A positive shear value indicates cyclonic shear, while a negative value indicates anticyclonic shear. If either end point is not directly over velocity data, the phrase \"no data\" is reported for the shear value. This tool is also useful in determining gate-to-gate shear. Simply place the two end points directly over adjacent gates of velocity data. \"Snapping\" VR Shear : If you are zoomed in over an area when you load VR - Shear, and the VR - Shear Baseline does not appear, click B3 to \"snap\" the Baseline to where the mouse cursor is located. VR - Shear in 4 Panel : You can use the VR - Shear Tool when the large display is in 4 panel mode. The VR - Shear overlay is loaded in different colors for each panel. There are actually four copies of the program running, and each behaves independently. This means that you can get accurate readings in any one of the four panels \u2014 one VR - Shear panel is editable at a time. To activate, click B2 on the VR - Shear legend in the desired panel and position the query line to the echoes of interest.","title":"V-R Shear"},{"location":"cave/d2d-satellite/","text":"NOAAport GINI imagery \uf0c1 Uniwisc McIDAS AREA files \uf0c1 VIIRS \uf0c1 VIIRS is one of five instruments onboard the NPP satellite. VIIRS' mission is to collect radiometric imagery in visible and infrared wavelengths of the Earth's surface; this includes observing fires, ice, ocean color, vegetation, clouds, and land and sea surface temperatures, and supplying high-resolution images and data used by meteorologists to assess climate change and improve short-term weather forecasting. The VIIRS submenu option provides VIIRS imagery and moderate band satellite displays for the CONUS, Alaska, and Pacific regions. In addition to accessing the NPP Product VIIRS data via the Satellite menu, the VIIRS Imagery data can also be accessed using the Product Browser . GOES and POES Sounding Data \uf0c1 GOES and POES Sounding Data Availability Plots displays the locations where GOES and POES temperature and moisture profiles are available. These soundings are displayed on a Skew-T/log P chart using the Points tool and the Volume Browser. 
Soundings from the GOES satellites are made only in relatively cloud-free areas, whereas POES systems produce temperature and moisture soundings in clear and cloudy atmospheres. Each hour, NESDIS provides the latest soundings from GOES East and West. Although the GOES East and West sounders yield soundings over a broad area, the default AWIPS configuration retains soundings only from within each site's Regional CAVE scale domain. POES soundings are generated approximately every 12 hours and have more global coverage. POES Imagery \uf0c1 The POES Imagery section of the Satellite menu contains selectors for IR Window, Visible, 3.7\u00b5, and 11-3.7\u00b5 products. These are viewable on all scales. Sounder Imagery \uf0c1 The products available from the Sounder Imagery submenu are based purely on the imager instruments aboard the GOES East (GE) and GOES West (GW) satellites. Derived Products Imagery \uf0c1 A variety of precipitation products are accessible from the Derived Products submenu. These products are derived from one or more of the various satellites (e.g., DMSP, POES, GOES, and GPS). Descriptions of the products follow. The Blended Rain Rate (formerly Rainfall Rate) product is produced hourly to gather recent rain rate retrievals from passive microwave instruments on six polar-orbiting satellites. The blended rain rate eliminates the bias between those data sets and provides a unified, meteorologically significant rain rate field to weather forecasters. The GOES products derived from the GOES satellite include Lifted Index, Total Precip Water (TPW), Cloud Amount, Cloud Top Height, Skin Temperature, and Low Cloud Base. Because the imagery from these products is based on the GOES sounder instrument, several important differences exist between these products and the other (imager-based) imagery. The main differences are that the resolution is no finer than 10 km, the product update frequency is driven by the sounder instrument (AWIPS receives a set of GOES East/West composite derived product images once per hour), and the areal coverage is based on that of the sounder scans, which is somewhat less than the areal coverage provided by the imager. Descriptions of the products follow. Lifted Index is a common measure of instability. Its value is obtained by computing the temperature that air near the ground would have if it were lifted to some higher level (usually around 18,000 feet), and comparing that temperature to the actual temperature at that level. The more negative the value, the more instability there is. Total Precip Water is the vertically integrated water vapor content in a column extending from the earth's surface to the top of the atmosphere. Cloud Amount provides an hourly update of cloud amounts within a geostationary satellite field of view. You can loop through the display to identify increasing/decreasing cloud conditions and trends. Cloud Top Height is the height of the cloud in thousands of feet (base - top). Skin Temperature is the temperature of the ocean surface water. Low Cloud Base provides nighttime images of fog and low stratus clouds derived from a combination of two GOES IR channels. This product identifies cloud ceilings of <1000 feet and is generated hourly starting between 2042 and 2142 GMT, and ending between 1510 and 1610 GMT the next day. 
This product is beneficial to the warning and forecast processes specific to aviation and terminal forecasting. The Total Precip Water (TPW) value can also be derived from the data sources of DMSP, SSM/I (Defense Meteorological Satellite Program Special Sensor Microwave / Imager), and POES AMSU (POES Advanced Microwave Sounding Unit) satellites, which are accessed from the DMSP SSM/I and POES AMSU sections of the submenu. Variations of TPW (\"Blended Total Precip Water\" and \"Percent of Normal TPW\") are selectable under the AMSU and SSM/I + GPS section. The Blended Total Precip Water product is a blend of the various data sources of AMSU, SSM/I, and GPS satellites, and can be over water or land. The Percent of Normal TPW product is calculated at various times (hourly, monthly, seasonally, etc.) to determine departures from the normal. From the information obtained, forecasters can predict the chances of having below-average, normal, or above-average precipitation in the upcoming months. SSM/I Point Data \uf0c1 The SSM/I Point Data plot displays data collected over the course of a day for calculating ocean wind speeds. GOES High Density Winds \uf0c1 The GOES High Density Winds submenu has options to display satellite-derived multi-layer wind plots from the IR, Visible, and three Water Vapor channels. In addition, you can display individual layers that display a composite of all the satellite channels. MTSAT High Density Winds \uf0c1 MTSAT High Density Winds cover the Western Pacific. ASCAT winds (25 km) \uf0c1 Scatterometer Winds are obtained from the ASCAT instrument on EUMETSAT's MetOP-A polar orbiting satellite. This instrument sends pulses of radiation to the ocean surface and measures the amount of energy, called backscatter, it receives back. When you sample these observations, the time, satellite ID, wind direction, and wind speed are provided. With the polar orbiting scanning, a given region will generally be sampled about every 12 hours. ASCAT winds (25 km retrieval resolution but interpolated and displayed at 12.5 km resolution) are launchable from both the CAVE Satellite menu and the Upper Air menu. The ASCAT instrument generates ocean surface wind retrievals. The ASCAT Scatterometer Ocean Winds product is displayable on CAVE at all scales.","title":"D2d satellite"},{"location":"cave/d2d-satellite/#noaaport-gini-imagery","text":"","title":"NOAAport GINI imagery"},{"location":"cave/d2d-satellite/#uniwisc-mcidas-area-files","text":"","title":"Uniwisc McIDAS AREA files"},{"location":"cave/d2d-satellite/#viirs","text":"VIIRS is one of five instruments onboard the NPP satellite. VIIRS' mission is to collect radiometric imagery in visible and infrared wavelengths of the Earth's surface; this includes observing fires, ice, ocean color, vegetation, clouds, and land and sea surface temperatures, and supplying high-resolution images and data used by meteorologists to assess climate change and improve short-term weather forecasting. The VIIRS submenu option provides VIIRS imagery and moderate band satellite displays for the CONUS, Alaska, and Pacific regions. In addition to accessing the NPP Product VIIRS data via the Satellite menu, the VIIRS Imagery data can also be accessed using the Product Browser .","title":"VIIRS"},{"location":"cave/d2d-satellite/#goes-and-poes-sounding-data","text":"GOES and POES Sounding Data Availability Plots displays the locations where GOES and POES temperature and moisture profiles are available. 
These soundings are displayed on a Skew-T/log P chart using the Points tool and the Volume Browser. Soundings from the GOES satellites are made only in relatively cloud-free areas, whereas POES systems produce temperature and moisture soundings in clear and cloudy atmospheres. Each hour, NESDIS provides the latest soundings from GOES East and West. Although the GOES East and West sounders yield soundings over a broad area, the default AWIPS configuration retains soundings only from within each site's Regional CAVE scale domain. POES soundings are generated approximately every 12 hours and have more global coverage.","title":"GOES and POES Sounding Data"},{"location":"cave/d2d-satellite/#poes-imagery","text":"The POES Imagery section of the Satellite menu contains selectors for IR Window, Visible, 3.7\u00b5, and 11-3.7\u00b5 products. These are viewable on all scales.","title":"POES Imagery"},{"location":"cave/d2d-satellite/#sounder-imagery","text":"The products available from the Sounder Imagery submenu are based purely on the imager instruments aboard the GOES East (GE) and GOES West (GW) satellites.","title":"Sounder Imagery"},{"location":"cave/d2d-satellite/#derived-products-imagery","text":"A variety of precipitation products are accessible from the Derived Products submenu. These products are derived from one or more of the various satellites (e.g., DMSP, POES, GOES, and GPS). Descriptions of the products follow. The Blended Rain Rate (formerly Rainfall Rate) product is produced hourly to gather recent rain rate retrievals from passive microwave instruments on six polar-orbiting satellites. The blended rain rate eliminates the bias between those data sets and provides a unified, meteorologically significant rain rate field to weather forecasters. The GOES products derived from the GOES satellite include Lifted Index, Total Precip Water (TPW), Cloud Amount, Cloud Top Height, Skin Temperature, and Low Cloud Base. Because the imagery from these products is based on the GOES sounder instrument, several important differences exist between these products and the other (imager-based) imagery. The main differences are that the resolution is no finer than 10 km, the product update frequency is driven by the sounder instrument (AWIPS receives a set of GOES East/West composite derived product images once per hour), and the areal coverage is based on that of the sounder scans, which is somewhat less than the areal coverage provided by the imager. Descriptions of the products follow. Lifted Index is a common measure of instability. Its value is obtained by computing the temperature that air near the ground would have if it were lifted to some higher level (usually around 18,000 feet), and comparing that temperature to the actual temperature at that level. The more negative the value, the more instability there is. Total Precip Water is the vertically integrated water vapor content in a column extending from the earth's surface to the top of the atmosphere. Cloud Amount provides an hourly update of cloud amounts within a geostationary satellite field of view. You can loop through the display to identify increasing/decreasing cloud conditions and trends. Cloud Top Height is the height of the cloud in thousands of feet (base - top). Skin Temperature is the temperature of the ocean surface water. Low Cloud Base provides nighttime images of fog and low stratus clouds derived from a combination of two GOES IR channels. 
This product identifies cloud ceilings of <1000 feet and is generated hourly starting between 2042 and 2142 GMT, and ending between 1510 and 1610 GMT the next day. This product is beneficial to the warning and forecast processes specific to aviation and terminal forecasting. The Total Precip Water (TPW) value can also be derived from the data sources of DMSP, SSM/I (Defense Meteorological Satellite Program Special Sensor Microwave / Imager), and POES AMSU (POES Advanced Microwave Sounding Unit) satellites, which are accessed from the DMSP SSM/I and POES AMSU sections of the submenu. Variations of TPW (\"Blended Total Precip Water\" and \"Percent of Normal TPW\") are selectable under the AMSU and SSM/I + GPS section. The Blended Total Precip Water product is a blend of the various data sources of AMSU, SSM/I, and GPS satellites, and can be over water or land. The Percent of Normal TPW product is calculated at various times (hourly, monthly, seasonally, etc.) to determine departures from the normal. From the information obtained, forecasters can predict the chances of having below-average, normal, or above-average precipitation in the upcoming months.","title":"Derived Products Imagery"},{"location":"cave/d2d-satellite/#ssmi-point-data","text":"The SSM/I Point Data plot displays data collected over the course of a day for calculating ocean wind speeds.","title":"SSM/I Point Data"},{"location":"cave/d2d-satellite/#goes-high-density-winds","text":"The GOES High Density Winds submenu has options to display satellite-derived multi-layer wind plots from the IR, Visible, and three Water Vapor channels. In addition, you can display individual layers that display a composite of all the satellite channels.","title":"GOES High Density Winds"},{"location":"cave/d2d-satellite/#mtsat-high-density-winds","text":"MTSAT High Density Winds cover the Western Pacific.","title":"MTSAT High Density Winds"},{"location":"cave/d2d-satellite/#ascat-winds-25-km","text":"Scatterometer Winds are obtained from the ASCAT instrument on EUMETSAT's MetOP-A polar orbiting satellite. This instrument sends pulses of radiation to the ocean surface and measures the amount of energy, called backscatter, it receives back. When you sample these observations, the time, satellite ID, wind direction, and wind speed are provided. With the polar orbiting scanning, a given region will generally be sampled about every 12 hours. ASCAT winds (25 km retrieval resolution but interpolated and displayed at 12.5 km resolution) are launchable from both the CAVE Satellite menu and the Upper Air menu. The ASCAT instrument generates ocean surface wind retrievals. The ASCAT Scatterometer Ocean Winds product is displayable on CAVE at all scales.","title":"ASCAT winds (25 km)"},{"location":"cave/d2d-tools/","text":"Display Tools \uf0c1 The display tools are a subset of the tools available in CAVE. These programs are accessible through the Tools dropdown menu. Many of the tools listed under the Tools menu can be placed into an editable state . Do not enable the \"Hide Legends\" feature if you want to place a tool in an editable state, because access to editability is done by clicking the center mouse button, or right-clicking over the Product Legend . Note : To see information about some of the other options in the Tools menu, check out the Radar Tools page. Az/Ran Overlay \uf0c1 This tool displays a movable azimuth/range radar map overlay. The overlay is in the \"editable\" state when displayed, and can be relocated by clicking the right mouse button. 
Baselines \uf0c1 Selecting Baselines displays 10 lines, labeled A-A' to J-J', along which cross-sections can be constructed from within the Volume Browser. Baselines come up editable. \"Snapping\" an Interactive Baseline: If you are zoomed in over an area when you load Interactive Baselines and no Baselines appear, press the right mouse button to \"snap\" a Baseline to where the mouse cursor is. The system chooses a Baseline that has not been recently used. If you are working with a Baseline, a second click with the right mouse button will return you to the original Baseline, even if you modified another Baseline in the meantime. Choose By ID \uf0c1 Choose By ID, which is a function of DMD (Digital Mesocyclone Display), is a method of selecting feature locations. The tool is used to monitor the same feature at a certain location. Without the Choose By ID tool, a monitored feature (over a period of time) could move away from its monitored location and another feature could move in its place. You can use Choose By ID to set points, baselines, and \"Home\" for conventional locations like METARs and RAOBs (Radiosonde Observations), but its primary use is for the WSR-88D-identified mesocyclone locations. You can also access the Choose By ID tool from the Tools menu on the Volume Browser. Distance Bearing \uf0c1 Selecting this tool displays six editable lines, each of which shows the azimuth and range of the labeled end of the line relative to the unlabeled end of the line. You can make the lines editable by clicking the center mouse button over the legend at the lower right of the display. Once in edit mode, a line can be moved as a unit and/or either of its end points can be adjusted. Distance Speed \uf0c1 This tool can be used to determine the speed and direction of a storm or any other meteorological feature of interest. Selecting Distance Speed displays a Centroid Marker to move to the location of the storm or feature of interest in any two or more frames of displayed imagery (e.g., a satellite or radar loop). The system then displays a storm track with the direction (degrees) and speed (knots) of movement. When you select the Distance Speed option, the Distance Speed dialog box opens. Mode : You have the following selections from this option. Point : A radio button that allows you to set the Centroid Marker as a single point. Polyline : A radio button that allows you to set the Centroid Marker as a polyline. Legend : You have the following selections from this option. Time : A radio button that allows you to display time with the Centroid Marker. Speed : A radio button that allows you to display speed with the Centroid Marker. Distance Scale \uf0c1 Enabling this feature adds a scalebar to the bottom right hand of the main D2D display. This tool can be used to determine the size of a storm or any other meteorological feature of interest. Feature Following Zoom \uf0c1 When you zoom in over a small area to be able to view a feature in detail, animation will often cause the feature to move into and then out of the field of view. This tool allows you to follow a feature of interest even when zoomed in to a small area. To use this feature, first, you need to identify the location and motion of the feature, using Distance Speed or the WarnGen tracker. Once satisfied that the tracking icon is following the feature of interest, load this tool, and the center of the zoom area will track with the Distance Speed icon. 
Toggling the overlay off will resume the standard zooming behavior, and toggling it back on will reinvoke the feature following zoom. Home \uf0c1 Selecting the Home option displays a marker, which is an \"X\" with the word \"Home\" next to it. Clicking on the Home Location Legend with the center mouse button makes the marker editable; drag the \"X\" or click with the right mouse button to change its location. When the Home Marker is displayed, use the Sample feature (click and hold to access the menu to turn on sampling) to display the range in miles and azimuth (in degrees) of the pointer location relative to the Home location. Points \uf0c1 The Points option initially displays a circular 10-point pattern, labeled A through J on the Map display. Points are used to generate model soundings, time-height cross-sections, time series, and variable vs. height plots using the Volume Browser. As with the Baselines, the locations of these Points can be edited in the following manner: \"Snapping\" an Interactive Point : If you are zoomed in over an area when you load Interactive Points and no Points appear, click the right mouse button to \"snap\" a Point to where the mouse cursor is positioned. The system chooses a Point that has not been recently used. If you are currently working with a Point, then a second right mouse button click will place another Point at the location of your cursor. Dynamic Reference Map : When you generate a model sounding, a time-height cross-section, a time series, or a variable vs. height plot, a small reference map indicating the location(s) of the plotted sounding(s) is provided in the upper left corner of the Main Display Pane. Points may be created, deleted, hidden, and manipulated (location, name, font, and color). Points are not limited in terms of number, location, or designation. Points may also be assigned to different groups to facilitate their use. Once the Points tools have been loaded, the addition, deletion, or manipulation of Points can be accomplished in three ways: Create Point Dialog Box : The Create Point dialog box is opened by clicking and holding the right mouse button on the map (but not on any existing Point) and selecting the \"New Point...\" option. The Create Point dialog box opens with the Lat and Lon text boxes populated with the latitude and longitude values at the point where you had clicked the right mouse button. The latitude and longitude values can be viewed in \"Degrees : Minutes : Seconds,\" \"Degrees : Minutes,\" or \"Degrees Only\" (N and S refer to North and South; W and E refer to West and East). In the Create Point dialog box, you must : Enter the Point's name And may do any of the following: Modify the latitude and longitude values Assign the Point's color and font use Assign the Point to a group Select whether the Point is movable or hidden By default, individual Points do not have an assigned color. They inherit the color of the Interactive Points layer reflected in the Interactive Points product legend. You can change the color of the Interactive Points layer by right clicking on the Interactive Points product legend and selecting a color from the dropdown list. The selected color then changes all points not having an assigned color to the new color. Points can be assigned to \" \" which will organize them in the root location containing the group names when accessed by the Edit Points dialog box (see below). 
Edit Point Dialog Box : The Edit Point dialog box is opened by clicking and holding the right mouse button on a Point on the map and selecting the \"Edit Point...\" option. The latitude and longitude values can be viewed in \"Degrees : Minutes : Seconds,\" \"Degrees : Minutes,\" or \"Degrees Only\" (N and S refer to North and South; W and E refer to West and East). Besides the option of selecting the Edit Points dialog box, you also have the option of selecting \"Hide Point,\" \"Delete Point,\" or \"Move Point.\" Once hidden, the Point can be unhidden using the Points List dialog box, where you would uncheck the checkbox under the \"Hidden\" column adjacent to the Point that was hidden (see below). If \"Delete Point\" is selected, a pop-up opens to confirm whether you want to delete the Point. Selecting the \"Move Point\" option moves the Point to wherever you place the cursor on the map. Points List Dialog Box : The Points List dialog box is opened by clicking and holding the right mouse button on the Interactive Points product legend and selecting the \"Edit Points...\" option. The Points List dialog box lists all the available groups and Points. Groups can be expanded to review the list of Points assigned to that group by clicking the arrow next to the group name. Initially, the default set of Points (A-J) are listed in the D2D Group, as shown above. In the Points List dialog box, Points and groups may be dragged into and out of other groups to create or disassemble subgroups. The Points List dialog box also includes three columns. Point Name : Lists the group name and designated Points. Movable : Checking the checkbox adjacent to the Point disables the Point from being moved. Hidden : Checking the checkbox adjacent to the Point hides the Point on the map. Put home cursor \uf0c1 The Put home cursor tool provides an easy way to locate a METAR observation station, a city and state, or a latitude/longitude coordinate. For Canada and Mexico, only the METAR observation stations and latitude/longitude coordinates are accessible. When you select Put home cursor from the Tools dropdown menu, the Home marker X is displayed and the Put Home Cursor dialog box opens. You can use the Home marker, as previously described in the Home Tool, and the new Home location (station, city/state, or latitude/longitude) is identified in the Put Home Cursor dialog box. Another way to use this tool is to type in the station, city and state, or latitude and longitude, and select Go, or hit Enter on the keypad, to move the Home marker to the specified location. The new location's nearest METAR site, city and state, and latitude and longitude appear in the Put Home Cursor dialog box. The Put Home Cursor dialog box contains the following options. Location Selection : There are three ways to find a desired location. Once you choose the Station, City/State, or Lat/Lon radio button, an Entry Box is activated next to the respective label within the Put Home Cursor dialog box. Enter the desired location information. Go : This menu button initiates the search for the desired station, city/state, or latitude/longitude. The Home marker jumps to the newly specified location. Range Rings \uf0c1 The Range Rings Tool displays adjustable range rings around locations of interest to your local office. When you select Range Rings from the Tools dropdown menu, the Range Rings legend appears in the Main Display Pane. The tool comes up editable, and the rangeRing dialog box opens. 
(Clicking the middle mouse button over the legend toggles tool editability and closes/opens the rangeRing dialog box.) Within this dialog box, you can toggle on/off any of the target locations using the square selectors. Adjust the size of the radii (in nautical miles) by typing a new value in the entry boxes associated with each location and pressing the Apply button. You can also add labels at the center of the range ring and/or at any of the radial distances using the Labels Options menu associated with each location. Using the Movable Rings, you can add a new location at a specific point by using the Interactive Points Tool, or by typing in latitude/longitude coordinates. There is no practical limit on the number of new locations you can add to the display. The list of locations is pre-set but can be customized at a field site. Sunset/Sunrise \uf0c1 By typing a date, as well as the latitude and longitude of a location into the Sunrise/Sunset Tool dialog box, you can obtain the time (for any time zone) of sunrise and sunset, as well as the total length of daylight for that date. Additional features include the ability to calculate the sunrise/sunset in a different hemisphere, and the azimuthal angles, relative to true north, of the sunrise and sunset. Text Window \uf0c1 Selecting this option brings up a Text Display window that behaves in the same way as a window on the Text Workstation , except that the scripts menu is disabled. Time of Arrival / Lead Time \uf0c1 Selecting the Time Of Arrival / Lead Time option displays a tracking line from a feature's initial starting point in a past frame to its final position in the current frame. Once the final position is set, an Arrival Point is displayed. You can drag this point anywhere along the line to get the Time Of Arrival / Lead Time and Distance. You can also change the Mode from Point to Circular Front or Polyline anywhere along the line to better represent the feature(s). Units Calculator \uf0c1 This tool converts the units of the first column into differing units of the second column. The units are grouped into temperature, speed, distance, time, and atmospheric pressure. First, simply type the number and select the units of the value you wish to convert in the first column entry box. Then in the second column, select the desired units to which you want the original value converted. The new value will appear in the second column entry box. Text Workstation \uf0c1 By selecting one of the \"Text\" buttons, a text window opens up. In National Weather Service operations, the text workstation is used to edit new warning text as well as look up past warnings, METARs, and TAFs. This functionality is disabled in the Unidata AWIPS version.","title":"Display Tools"},{"location":"cave/d2d-tools/#display-tools","text":"The display tools are a subset of the tools available in CAVE. These programs are accessible through the Tools dropdown menu. Many of the tools listed under the Tools menu can be placed into an editable state . Do not enable the \"Hide Legends\" feature if you want to place a tool in an editable state, because access to editability is done by clicking the center mouse button, or right-clicking over the Product Legend . Note : To see information about some of the other options in the Tools menu, check out the Radar Tools page.","title":"Display Tools"},{"location":"cave/d2d-tools/#azran-overlay","text":"This tool displays a movable azimuth/range radar map overlay. 
The overlay is in the \"editable\" state when displayed, and can be relocated by clicking the right mouse button.","title":"Az/Ran Overlay"},{"location":"cave/d2d-tools/#baselines","text":"Selecting Baselines displays 10 lines, labeled A-A' to J-J', along which cross-sections can be constructed from within the Volume Browser. Baselines come up editable. \"Snapping\" an Interactive Baseline: If you are zoomed in over an area when you load Interactive Baselines and no Baselines appear, press the right mouse button to \"snap\" a Baseline to where the mouse cursor is. The system chooses a Baseline that has not been recently used. If you are working with a Baseline, a second click with the right mouse button will return you to the original Baseline, even if you modified another Baseline in the meantime.","title":"Baselines"},{"location":"cave/d2d-tools/#choose-by-id","text":"Choose By ID, which is a function of DMD (Digital Mesocyclone Display), is a method of selecting feature locations. The tool is used to monitor the same feature at a certain location. Without the Choose By ID tool, a monitored feature (over a period of time) could move away from its monitored location and another feature could move in its place. You can use Choose By ID to set points, baselines, and \"Home\" for conventional locations like METARs and RAOBs (Radiosonde Observations), but its primary use is for the WSR-88D-identified mesocyclone locations. You can also access the Choose By ID tool from the Tools menu on the Volume Browser.","title":"Choose By ID"},{"location":"cave/d2d-tools/#distance-bearing","text":"Selecting this tool displays six editable lines, each of which shows the azimuth and range of the labeled end of the line relative to the unlabeled end of the line. You can make the lines editable by clicking the center mouse button over the legend at the lower right of the display. Once in edit mode, a line can be moved as a unit and/or either of its end points can be adjusted.","title":"Distance Bearing"},{"location":"cave/d2d-tools/#distance-speed","text":"This tool can be used to determine the speed and direction of a storm or any other meteorological feature of interest. Selecting Distance Speed displays a Centroid Marker to move to the location of the storm or feature of interest in any two or more frames of displayed imagery (e.g., a satellite or radar loop). The system then displays a storm track with the direction (degrees) and speed (knots) of movement. When you select the Distance Speed option, the Distance Speed dialog box opens. Mode : You have the following selections from this option. Point : A radio button that allows you to set the Centroid Marker as a single point. Polyline : A radio button that allows you to set the Centroid Marker as a polyline. Legend : You have the following selections from this option. Time : A radio button that allows you to display time with the Centroid Marker. Speed : A radio button that allows you to display speed with the Centroid Marker.","title":"Distance Speed"},{"location":"cave/d2d-tools/#distance-scale","text":"Enabling this feature adds a scalebar to the bottom right hand of the main D2D display. This tool can be used to determine the size of a storm or any other meteorological feature of interest.","title":"Distance Scale"},{"location":"cave/d2d-tools/#feature-following-zoom","text":"When you zoom in over a small area to be able to view a feature in detail, animation will often cause the feature to move into and then out of the field of view. 
This tool allows you to follow a feature of interest even when zoomed in to a small area. To use this feature, first, you need to identify the location and motion of the feature, using Distance Speed or the WarnGen tracker. Once satisfied that the tracking icon is following the feature of interest, load this tool, and the center of the zoom area will track with the Distance Speed icon. Toggling the overlay off will resume the standard zooming behavior, and toggling it back on will reinvoke the feature following zoom.","title":"Feature Following Zoom"},{"location":"cave/d2d-tools/#home","text":"Selecting the Home option displays a marker, which is an \"X\" with the word \"Home\" next to it. Clicking on the Home Location Legend with the center mouse button makes the marker editable; drag the \"X\" or click with the right mouse button to change its location. When the Home Marker is displayed, use the Sample feature (click and hold to access the menu to turn on sampling) to display the range in miles and azimuth (in degrees) of the pointer location relative to the Home location.","title":"Home"},{"location":"cave/d2d-tools/#points","text":"The Points option initially displays a circular 10-point pattern, labeled A through J on the Map display. Points are used to generate model soundings, time-height cross-sections, time series, and variable vs. height plots using the Volume Browser. As with the Baselines, the locations of these Points can be edited in the following manner: \"Snapping\" an Interactive Point : If you are zoomed in over an area when you load Interactive Points and no Points appear, click the right mouse button to \"snap\" a Point to where the mouse cursor is positioned. The system chooses a Point that has not been recently used. If you are currently working with a Point, then a second right mouse button click will place another Point at the location of your cursor. Dynamic Reference Map : When you generate a model sounding, a time-height cross-section, a time series, or a variable vs. height plot, a small reference map indicating the location(s) of the plotted sounding(s) is provided in the upper left corner of the Main Display Pane. Points may be created, deleted, hidden, and manipulated (location, name, font, and color). Points are not limited in terms of number, location, or designation. Points may also be assigned to different groups to facilitate their use. Once the Points tools have been loaded, the addition, deletion, or manipulation of Points can be accomplished in three ways: Create Point Dialog Box : The Create Point dialog box is opened by clicking and holding the right mouse button on the map (but not on any existing Point) and selecting the \"New Point...\" option. The Create Point dialog box opens with the Lat and Lon text boxes populated with the latitude and longitude values at the point where you had clicked the right mouse button. The latitude and longitude values can be viewed in \"Degrees : Minutes : Seconds,\" \"Degrees : Minutes,\" or \"Degrees Only\" (N and S refer to North and South; W and E refer to West and East). In the Create Point dialog box, you must : Enter the Point's name And may do any of the following: Modify the latitude and longitude values Assign the Point's color and font use Assign the Point to a group Select whether the Point is movable or hidden By default, individual Points do not have an assigned color. They inherit the color of the Interactive Points layer reflected in the Interactive Points product legend. 
You can change the color of the Interactive Points layer by right clicking on the Interactive Points product legend and selecting a color from the dropdown list. The selected color then changes all points not having an assigned color to the new color. Points can be assigned to \" \" which will organize them in the root location containing the group names when accessed by the Edit Points dialog box (see below). Edit Point Dialog Box : The Edit Point dialog box is opened by clicking and holding the right mouse button on a Point on the map and selecting the \"Edit Point...\" option. The latitude and longitude values can be viewed in \"Degrees : Minutes : Seconds,\" \"Degrees : Minutes,\" or \"Degrees Only\" (N and S refer to North and South; W and E refer to West and East). Besides the option of selecting the Edit Points dialog box, you also have the option of selecting \"Hide Point,\" \"Delete Point,\" or \"Move Point.\" Once hidden, the Point can be unhidden using the Points List dialog box, where you would uncheck the checkbox under the \"Hidden\" column adjacent to the Point that was hidden (see below). If \"Delete Point\" is selected, a pop-up opens to confirm whether you want to delete the Point. Selecting the \"Move Point\" option moves the Point to wherever you place the cursor on the map. Points List Dialog Box : The Points List dialog box is opened by clicking and holding the right mouse button on the Interactive Points product legend and selecting the \"Edit Points...\" option. The Points List dialog box lists all the available groups and Points. Groups can be expanded to review the list of Points assigned to that group by clicking the arrow next to the group name. Initially, the default set of Points (A-J) are listed in the D2D Group, as shown above. In the Points List dialog box, Points and groups may be dragged into and out of other groups to create or disassemble subgroups. The Points List dialog box also includes three columns. Point Name : Lists the group name and designated Points. Movable : Checking the checkbox adjacent to the Point disables the Point from being moved. Hidden : Checking the checkbox adjacent to the Point hides the Point on the map.","title":"Points"},{"location":"cave/d2d-tools/#put-home-cursor","text":"The Put home cursor tool provides an easy way to locate a METAR observation station, a city and state, or a latitude/longitude coordinate. For Canada and Mexico, only the METAR observation stations and latitude/longitude coordinates are accessible. When you select Put home cursor from the Tools dropdown menu, the Home marker X is displayed and the Put Home Cursor dialog box opens. You can use the Home marker, as previously described in the Home Tool, and the new Home location (station, city/state, or latitude/longitude) is identified in the Put Home Cursor dialog box. Another way to use this tool is to type in the station, city and state, or latitude and longitude, and select Go, or hit Enter on the keypad, to move the Home marker to the specified location. The new location's nearest METAR site, city and state, and latitude and longitude appear in the Put Home Cursor dialog box. The Put Home Cursor dialog box contains the following options. Location Selection : There are three ways to find a desired location. Once you choose the Station, City/State, or Lat/Lon radio button, an Entry Box is activated next to the respective label within the Put Home Cursor dialog box. Enter the desired location information. 
Go : This menu button initiates the search for the desired station, city/state, or latitude/longitude. The Home marker jumps to the newly specified location.","title":"Put home cursor"},{"location":"cave/d2d-tools/#range-rings","text":"The Range Rings Tool displays adjustable range rings around locations of interest to your local office. When you select Range Rings from the Tools dropdown menu, the Range Rings legend appears in the Main Display Pane. The tool comes up editable, and the rangeRing dialog box opens. (Clicking the middle mouse button over the legend toggles tool editability and closes/opens the rangeRing dialog box.) Within this dialog box, you can toggle on/off any of the target locations using the square selectors. Adjust the size of the radii (in nautical miles) by typing a new value in the entry boxes associated with each location and pressing the Apply button. You can also add labels at the center of the range ring and/or at any of the radial distances using the Labels Options menu associated with each location. Using the Movable Rings, you can add a new location at a specific point by using the Interactive Points Tool, or by typing in latitude/longitude coordinates. There is no practical limit on the number of new locations you can add to the display. The list of locations is pre-set but can be customized at a field site.","title":"Range Rings"},{"location":"cave/d2d-tools/#sunsetsunrise","text":"By typing a date, as well as the latitude and longitude of a location into the Sunrise/Sunset Tool dialog box, you can obtain the time (for any time zone) of sunrise and sunset, as well as the total length of daylight for that date. Additional features include the ability to calculate the sunrise/sunset in a different hemisphere, and the azimuthal angles, relative to true north, of the sunrise and sunset.","title":"Sunset/Sunrise"},{"location":"cave/d2d-tools/#text-window","text":"Selecting this option brings up a Text Display window that behaves in the same way as a window on the Text Workstation , except that the scripts menu is disabled.","title":"Text Window"},{"location":"cave/d2d-tools/#time-of-arrival-lead-time","text":"Selecting the Time Of Arrival / Lead Time option displays a tracking line from a feature's initial starting point in a past frame to its final position in the current frame. Once the final position is set, an Arrival Point is displayed. You can drag this point anywhere along the line to get the Time Of Arrival / Lead Time and Distance. You can also change the Mode from Point to Circular Front or Polyline anywhere along the line to better represent the feature(s).","title":"Time of Arrival / Lead Time"},{"location":"cave/d2d-tools/#units-calculator","text":"This tool converts the units of the first column into differing units of the second column. The units are grouped into temperature, speed, distance, time, and atmospheric pressure. First, simply type the number and select the units of the value you wish to convert in the first column entry box. Then in the second column, select the desired units to which you want the original value converted. The new value will appear in the second column entry box.","title":"Units Calculator"},{"location":"cave/d2d-tools/#text-workstation","text":"By selecting one of the \"Text\" buttons, a text window opens up. In National Weather Service operations, the text workstation is used to edit new warning text as well as look up past warnings, METARs, and TAFs. 
This functionality is disabled in the Unidata AWIPS version.","title":"Text Workstation"},{"location":"cave/d2d-uair/","text":"The Upper Air dropdown menu provides access to upper air plots, profiler data, radar plan-view and perspective displays of winds, and aircraft and rawinsonde data. Nearby Radiosonde Observations (RAOB) are also included on the menu to provide easy viewing of upper air data. NSHARP Upper Air Soundings \uf0c1 RAOB data is plotted on the standard Skew-T log-p thermodynamic diagram. A small reference map indicating the location(s) of the plotted sounding(s) is provided in the upper left corner of the main display pane. If you overlay another Skew-T whose location is far from the original sounding location, the reference map updates to show both locations. NUCAPS Soundings \uf0c1 The NOAA Unique CrIS/ATMS Processing System ( NUCAPS ) soundings are derived from processing of CrIS/ATMS data, which provides cloud-cleared radiances and trace gases that enable increased accuracy in the retrieval of vertical profiles of temperature and water vapor. By clicking on the individual dots, the forecaster is able to render the sounding for the selected point using the NSHARP plugin . Upper Air Plots \uf0c1 NCEP: 200mb to 850mb RAOB: 150mb to 925mb UKMO 500mb Height \uf0c1 500mb height graphic out to 144 forecast hours. CPC Charts \uf0c1 6-10 day mean 500mb Height 8-14 day mean 500mb Height 6-10 day 500mb Height Anomaly 8-14 day 500mb Height Anomaly NPN Profiler Time-Height \uf0c1 NOAA Profiler Network ( NPN ) observations as a time-series plot. This time-series plugin is also used in the Volume Browser plugin for both grids and observations. NPN Profiler Plot \uf0c1 200hPa-925hPa 1500m-500m AGL Surface Radar VWP Height-Level \uf0c1 15km AGL 14km AGL 13km AGL ... 500m AGL 250m AGL 100m AGL Radar VWP Pressure-Level \uf0c1 200hPa to 925hPa PIREP Aircraft Plot \uf0c1 The Aircraft data includes Low-, Mid-, and High-level Pilot Weather Report (PIREP) observations. The display plots the temperature, aircraft identifier, wind speed and direction, significant weather, and the flight level (in feet). Pilot reports are critical for air safety. Pilots' reports on the conditions they are experiencing show up in a matter of minutes on AWIPS. Weather conditions can change quickly, and there is nothing like having a pilot report to provide a bird's eye view of what it is really like up there. PIREPs may validate forecast conditions, or they may describe real-time weather that varies from them. Icing: Low Level, Mid Level, High Level Turbulence: Low Level, Mid Level, High Level Aircraft MDCRS \uf0c1 Meteorological Data Collection and Reporting System (MDCRS) data includes plan-view plots for various 5kft layers and ascent/descent soundings. Using the availability plots (Upper Air menu under MDCRS plots) and ACARS Airports from the Maps menu button, you can locate airports that have available soundings. ACARS Airports provides an illustration of locations of airports, but it is not necessary to use it. The \"+\" sign means a temperature sounding and the \"*\" means a temperature and dewpoint sounding. To see a sounding at a location, simply press the Points menu button. Several points from letters of the alphabet will appear on the map display. To view a sounding, drag one of the points/letters to a \"+\" or \"*\" location. From the menu bar press Volume and then Browser. 
From the Volume Browser select MDCRS for Source, Sounding for Fields and select the letter/point on the desired location for Points. Click on your selection in the Product Selection List and then press the Load button to view the sounding. A zoomable inset map (NW corner) is available to show the location of the sounding. When you zoom in by clicking mouse Button 2 (B2), the flight track of the ascent/descent sounding is shown on the map. In addition, you can sample the flight track to see the time and elevation. To zoom out, click mouse Button 1 (B1). This inset map (and also those on var vs. height displays, cross sections, and cell trends) can be suppressed by setting the global density (i.e., from the tool bar) at less than 1. 000-500hft in 50ft increments 1 hour profile availability 6 hour profile availability SIGMET and AIRMET reports: Convective, Icing, Turbulence, Tropical, Volcanic \uf0c1 SIGMET \uf0c1 SIGMET (Significant Meteorological Information) is an alphanumeric message describing specific aviation hazard conditions between the surface and 45,000 feet (FL450). A SIGMET includes information about the location of the hazard using VOR locations. SIGMETs are produced on an as-needed basis at the AWC and are distributed on the SBN. AIRMET \uf0c1 AIRMET (Airmen's Meteorological Information) is an alpha-numeric message describing specific aviation hazard conditions between the surface and 45,000 feet (FL450), but not requiring the issuance of a SIGMET. An AIRMET includes information about the location of the hazard using VOR locations. AIRMETs are produced every 6 hours at the AWC for the CONUS area, and are distributed on the SBN. Visibility Products \uf0c1 IFR, Mountain Obscn \uf0c1 Medium Level, High Level \uf0c1","title":"D2d uair"},{"location":"cave/d2d-uair/#nsharp-upper-air-soundings","text":"RAOB data is plotted on the standard Skew-T log-p thermodynamic diagram. A small reference map indicating the location(s) of the plotted sounding(s) is provided in the upper left corner of the main display pane. If you overlay another Skew-T whose location is far from the original sounding location, the reference map updates to show both locations.","title":"NSHARP Upper Air Soundings"},{"location":"cave/d2d-uair/#nucaps-soundings","text":"The NOAA Unique CrIS/ATMS Processing System ( NUCAPS ) soundings are derived from processing of CrIS/ATMS data, which provides cloud-cleared radiances and trace gases that enable increased accuracy in the retrieval of vertical profiles of temperature and water vapor. By clicking on the individual dots, the forecaster is able to render the sounding for the selected point using the NSHARP plugin .","title":"NUCAPS Soundings"},{"location":"cave/d2d-uair/#upper-air-plots","text":"NCEP: 200mb to 850mb RAOB: 150mb to 925mb","title":"Upper Air Plots"},{"location":"cave/d2d-uair/#ukmo-500mb-height","text":"500mb height graphic out to 144 forecast hours.","title":"UKMO 500mb Height"},{"location":"cave/d2d-uair/#cpc-charts","text":"6-10 day mean 500mb Height 8-14 day mean 500mb Height 6-10 day 500mb Height Anomaly 8-14 day 500mb Height Anomaly","title":"CPC Charts"},{"location":"cave/d2d-uair/#npn-profiler-time-height","text":"NOAA Profiler Network ( NPN ) observations as a time-series plot. 
This time-series plugin is also used in the Volume Browser plugin for both grids and observations.","title":"NPN Profiler Time-Height"},{"location":"cave/d2d-uair/#npn-profiler-plot","text":"200hPa-925hPa 1500m-500m AGL Surface","title":"NPN Profiler Plot"},{"location":"cave/d2d-uair/#radar-vwp-height-level","text":"15km AGL 14km AGL 13km AGL ... 500m AGL 250m AGL 100m AGL","title":"Radar VWP Height-Level"},{"location":"cave/d2d-uair/#radar-vwp-pressure-level","text":"200hPa to 925hPa","title":"Radar VWP Pressure-Level"},{"location":"cave/d2d-uair/#pirep-aircraft-plot","text":"The Aircraft data includes Low-, Mid-, and High-level Pilot Weather Report (PIREP) observations. The display plots the temperature, aircraft identifier, wind speed and direction, significant weather, and the flight level (in feet). Pilot reports are critical for air safety. Pilots' reports on the conditions they are experiencing show up in a matter of minutes on AWIPS. Weather conditions can change quickly, and there is nothing like having a pilot report to provide a bird's eye view of what it is really like up there. PIREPs may validate forecast conditions, or they may describe real-time weather that varies from them. Icing: Low Level, Mid Level, High Level Turbulence: Low Level, Mid Level, High Level","title":"PIREP Aircraft Plot"},{"location":"cave/d2d-uair/#aircraft-mdcrs","text":"Meteorological Data Collection and Reporting System (MDCRS) data includes plan-view plots for various 5kft layers and ascent/descent soundings. Using the availability plots (Upper Air menu under MDCRS plots) and ACARS Airports from the Maps menu button, you can locate airports that have available soundings. ACARS Airports provides an illustration of locations of airports, but it is not necessary to use it. The \"+\" sign means a temperature sounding and the \"*\" means a temperature and dewpoint sounding. To see a sounding at a location, simply press the Points menu button. Several points from letters of the alphabet will appear on the map display. To view a sounding, drag one of the points/letters to a \"+\" or \"*\" location. From the menu bar press Volume and then Browser. From the Volume Browser select MDCRS for Source, Sounding for Fields and select the letter/point on the desired location for Points. Click on your selection in the Product Selection List and then press the Load button to view the sounding. A zoomable inset map (NW corner) is available to show the location of the sounding. When you zoom in by clicking mouse Button 2 (B2), the flight track of the ascent/descent sounding is shown on the map. In addition, you can sample the flight track to see the time and elevation. To zoom out, click mouse Button 1 (B1). This inset map (and also those on var vs. height displays, cross sections, and cell trends) can be suppressed by setting the global density (i.e., from the tool bar) at less than 1. 000-500hft in 50ft increments 1 hour profile availability 6 hour profile availability","title":"Aircraft MDCRS"},{"location":"cave/d2d-uair/#sigmet-and-airmet-reports-convective-icing-turbulance-tropical-volcanic","text":"","title":"SIGMET and AIRMET reports: Convective, Icing, Turbulence, Tropical, Volcanic"},{"location":"cave/d2d-uair/#sigmet","text":"SIGMET (Significant Meteorological Information) is an alphanumeric message describing specific aviation hazard conditions between the surface and 45,000 feet (FL450). A SIGMET includes information about the location of the hazard using VOR locations. 
SIGMETs are produced on an as-needed basis at the AWC and are distributed on the SBN.","title":"SIGMET"},{"location":"cave/d2d-uair/#airmet","text":"AIRMET (Airmen's Meteorological Information) is an alpha-numeric message describing specific aviation hazard conditions between the surface and 45,000 feet (FL450), but not requiring the issuance of a SIGMET. An AIRMET includes information about the location of the hazard using VOR locations. AIRMETs are produced every 6 hours at the AWC for the CONUS area, and are distributed on the SBN.","title":"AIRMET"},{"location":"cave/d2d-uair/#visibility-products","text":"","title":"Visibility Products"},{"location":"cave/d2d-uair/#ifr-mountain-obscn","text":"","title":"IFR, Mountain Obscn"},{"location":"cave/d2d-uair/#medium-level-high-level","text":"","title":"Medium Level, High Level"},{"location":"cave/goes-16-17-satellite/","text":"GOES 16/17 \uf0c1 The goesr EDEX decoder supports the ingest of GOES products coming over NOAAPort and Unidata's IDD. These include single channel imagery , derived products (Level 2b netCDF files), gridded Geostationary Lightning Mapper (GLM) products (produced by Eric Bruning at Texas Tech), CIRA created RGB specific products, and vertical temperature/moisture profiles . Using derived parameters, additional RGB and channel difference products can be loaded. The dmw EDEX decoder supports the ingest of GOES derived motion winds . GOES East and West products are accessible in the Satellite menu. The menu is broken into sections starting with common CONUS GOES East/West Combo products. There are submenus for each of the separate geospatial sectors: East Full Disk East CONUS East Mesoscale Sectors (x2) West Full Disk West CONUS West Mesoscale Sectors (x2) Hawaii Alaska Puerto Rico Each sector submenu has products for individual channels and vertical profiles, as well as submenus for derived products, channel differences, RGB Composites, GLM data, and derived motion winds. GLM data can also be found with its own submenu option a little lower down the menu and under the Surface menu. The RGB products are not available on MacOS or in a Virtual Machine running CAVE. LDM Pattern Actions \uf0c1 The Unidata IDD redistributes both the NOAAPort/SBN GOES tiled products as well as stitched together GOES products. While AWIPS can decode and ingest both, it's important to only be requesting from one or the other so you aren't creating duplicate processing. The entries that should be used for GOES data are shown below which is found in the LDM's pqact.conf file, located in /awips2/ldm/etc . (For the full list of pqact entries, you can view this file). # GOES 16/17 Single Channel (ABI) via Unidata IDD NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/CloudAndMoistureImagery/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CMI-IDD/\\5\\6\\7\\8.nc4 # GOES 16/17 derived products + derived motion wind via SBN HDS ^(IXT.[8-9]9) (KNES) (..)(..)(..) FILE -close -edex /awips2/data_store/GOES/(\\3:yyyy)(\\3:mm)\\3/\\4/derivedProducts-SBN/\\1_KNES_\\2\\3\\4\\5-(seq) NOTHER ^(IXT[WXY]01) (KNES) (..)(..)(..) 
FILE -close -edex /awips2/data_store/GOES/(\\3:yyyy)(\\3:mm)\\3/\\4/derivedProducts-SBN/\\1_KNES_\\2\\3\\4\\5-(seq) # GOES 16 GLM Gridded Products via Texas Tech-->Unidata IDD NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/GeostationaryLightningMapper/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\3/\\6/GLM-IDD/\\4\\5\\6\\7.nc4 # GOES CIRA derived products NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/GeoColor/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CIRA/GeoColor/\\5\\6\\7\\8.nc4 NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/DebraDust/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CIRA/DebraDust/\\5\\6\\7\\8.nc4 NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/CloudSnow/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CIRA/CloudSnow/\\5\\6\\7\\8.nc4 Individual Channels \uf0c1 All geospatial sectors have 16 individual channel products that can be viewed. Below are samples of Channel 14 (11.20\u03bcm) for each of the sectors. East CONUS 1km \uf0c1 East Full Disk 6km \uf0c1 East Mesoscale Sectors (EMESO-1, EMESO-2) \uf0c1 Two floating mesoscale sectors (location will vary day to day from image shown) West CONUS 1km \uf0c1 West Full Disk \uf0c1 West Mesoscale Sectors (WMESO-1, WMESO-2) \uf0c1 Two floating mesoscale sectors (location will vary day to day from image shown) Alaska \uf0c1 Hawaii \uf0c1 Puerto Rico (PRREGI) \uf0c1 RGB Composites \uf0c1 RGB Composites are made by combining 3 channels and are available for each sector. Quite a few new RGB products have been added in Unidata's 18.2.1 release. These products are generated on the fly in AWIPS using the existing channel products from EDEX. GOES RGB Imagery is NOT SUPPORTED on macOS or within a Virtual Machine OpenGL Shading Language limitations prevent multi-channel imagery from displaying correctly on Mac or in a Virtual Machine. Please use the Linux or Windows installs to view RGB products. Day Cloud Phase \uf0c1 Fire Temperature \uf0c1 Day Land Cloud \uf0c1 Day Cloud Convection \uf0c1 Day Land Cloud Fires \uf0c1 VIS/IR Sandwich \uf0c1 Simple Water Vapor \uf0c1 Air Mass \uf0c1 Ash \uf0c1 Day Convection \uf0c1 Day Snow Fog \uf0c1 Differential Water Vapor \uf0c1 Dust \uf0c1 CIMSS Natural Color \uf0c1 Nighttime Microphysics \uf0c1 SO2 \uf0c1 CIRA Geocolor \uf0c1 CIRA Debra Dust \uf0c1 CIRA Cloud Snow \uf0c1 Daytime Composite 1 \uf0c1 Daytime Composite 5 \uf0c1 Channel Differences \uf0c1 Channel differences are the result of subtracting one channel from another to produce a new product. These products are generated on the fly in AWIPS using the existing channel products from EDEX. There currently 10 channel differences that are offered in CAVE: Split Window (10.3 - 12.3 \u03bcm) Split Cloud Top Phase (11.2 - 8.4 \u03bcm) Night Fog (10.3 - 2.9 \u03bcm) Day Fog (3.9 - 10.3 \u03bcm) Split Fire (2.2 - 1.6 \u03bcm) Split Ozone (9.6 - 10.3 \u03bcm) Split Water Vapor (6.19 - 7.3 \u03bcm) Split Snow (1.6 - 0.64 \u03bcm) Vegetation (0.64 - 0.87 \u03bcm) Upper Level Info (11.2 - 6.19 \u03bcm) The rendering of these products uses the Jep package in Python, which has specific install instructions for Windows. Derived Products \uf0c1 Derived products are also known as Level 2+ products. 
Currently there are only derived products from GOES East available in AWIPS. Each sector has a different set of products available. To find out some more information on some of the products please the Quick Guides compiled by CIRA. These may not all be available for each sector. The current products offered in CAVE are listed below and to the right is which GOES East sector they are available for (F=Full Disk, C=CONUS, M=Mesoscale): Aerosol Detection - F,C,M Aerosol Optical Depth - F,C Clear Sky Mask - F,C,M Cloud Optical Depth - F,C Cloud Particle Size -F,C,M Cloud Top Height -F,C,M Cloud Top Phase -F,C,M Cloud Top Pressure -F,C Cloud Top Temperature - F,M Derived CAPE - F,C,M Derived K-Index - F,C,M Derived Lifted Index - F,C,M Derived Showalter Index - F,C,M Derived Total Totals - F,C,M Fire Area - F,C Fire Power - F,C Fire Temperature - F,C Instrument Flight Rule (IFR) Probability - C Low IFR Probability - C Marginal Visual Flight Rules (MVFR) Probability - C Cloud Thickness - C Land Skin Temperature - F,C,M RR/QPE - F Sea Surface Temperature - F Total Precip Water - F,C,M Geostationary Lightning Mapper (GLM) \uf0c1 Dr. Eric Bruning at Texas Tech has taken the raw GLM data and coded up some new gridded products that can be ingested and displayed in AWIPS. Minimum Flash Area Average Flash Area Flash Extent Density Group Extent Density Total Optical Energy GLM data are located in the menu structure: Satellite > [SECTOR] > GLM Products . You can also access the data from Surface > GLM - Geostationary Lightning Mapper submenus. Derived Motion Winds \uf0c1 Derived Motion Wind Vectors are produced using sequential ABI images and can provide information about winds at different levels. The wind vectors are computed using both visible and infrared imagery. Winds can be plotted by different pressure layers or individual channels. More information can be found here . Below is an image of the winds at different pressure layers. Vertical Temperature and Moisture Profile \uf0c1 Vertical Temperature and Moisture profiles are available in AWIPS. Similar to NUCAPS, when loaded in CAVE, a circle is displayed for each location that has a vertical profile available. When clicking on the circle, NSHARP will open with the vertical temperature and moisture profile. These profiles are GFS data that have been adjusted based on the satellite observations. More information can be found here . HDF5 Data Store \uf0c1 Decoded GOES satellite data are stored in /awips2/edex/data/hdf5/satellite/ under sector subdirectories: drwxr-xr-x awips fxalpha 4096 AKREGI drwxr-xr-x awips fxalpha 4096 Antarctic drwxr-xr-x awips fxalpha 4096 Arctic drwxr-xr-x awips fxalpha 4096 AREA0600 drwxr-xr-x awips fxalpha 4096 AREA0700 drwxr-xr-x awips fxalpha 4096 AREA3100 drwxr-xr-x awips fxalpha 4096 AREA3101 drwxr-xr-x awips fxalpha 12288 ECONUS drwxr-xr-x awips fxalpha 4096 EFD drwxr-xr-x awips fxalpha 4096 EMESO-1 drwxr-xr-x awips fxalpha 4096 EMESO-2 drwxr-xr-x awips fxalpha 4096 HIREGI drwxr-xr-x awips fxalpha 4096 NEXRCOMP drwxr-xr-x awips fxalpha 4096 PRREGI drwxr-xr-x awips fxalpha 4096 WCONUS drwxr-xr-x awips fxalpha 4096 WFD drwxr-xr-x awips fxalpha 4096 WMESO-1 drwxr-xr-x awips fxalpha 4096 WMESO-2","title":"GOES 16/17"},{"location":"cave/goes-16-17-satellite/#goes-1617","text":"The goesr EDEX decoder supports the ingest of GOES products coming over NOAAPort and Unidata's IDD. 
These include single channel imagery , derived products (Level 2b netCDF files), gridded Geostationary Lightning Mapper (GLM) products (produced by Eric Bruning at Texas Tech), CIRA created RGB specific products, and vertical temperature/moisture profiles . Using derived parameters, additional RGB and channel difference products can be loaded. The dmw EDEX decoder supports the ingest of GOES derived motion winds . GOES East and West products are accessible in the Satellite menu. The menu is broken into sections starting with common CONUS GOES East/West Combo products. There are submenus for each of the separate geospatial sectors: East Full Disk East CONUS East Mesoscale Sectors (x2) West Full Disk West CONUS West Mesoscale Sectors (x2) Hawaii Alaska Puerto Rico Each sector submenu has products for individual channels and vertical profiles, as well as submenus for derived products, channel differences, RGB Composites, GLM data, and derived motion winds. GLM data can also be found with its own submenu option a little lower down the menu and under the Surface menu. The RGB products are not available on MacOS or in a Virtual Machine running CAVE.","title":"GOES 16/17"},{"location":"cave/goes-16-17-satellite/#ldm-pattern-actions","text":"The Unidata IDD redistributes both the NOAAPort/SBN GOES tiled products as well as stitched together GOES products. While AWIPS can decode and ingest both, it's important to only be requesting from one or the other so you aren't creating duplicate processing. The entries that should be used for GOES data are shown below which is found in the LDM's pqact.conf file, located in /awips2/ldm/etc . (For the full list of pqact entries, you can view this file). # GOES 16/17 Single Channel (ABI) via Unidata IDD NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/CloudAndMoistureImagery/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CMI-IDD/\\5\\6\\7\\8.nc4 # GOES 16/17 derived products + derived motion wind via SBN HDS ^(IXT.[8-9]9) (KNES) (..)(..)(..) FILE -close -edex /awips2/data_store/GOES/(\\3:yyyy)(\\3:mm)\\3/\\4/derivedProducts-SBN/\\1_KNES_\\2\\3\\4\\5-(seq) NOTHER ^(IXT[WXY]01) (KNES) (..)(..)(..) FILE -close -edex /awips2/data_store/GOES/(\\3:yyyy)(\\3:mm)\\3/\\4/derivedProducts-SBN/\\1_KNES_\\2\\3\\4\\5-(seq) # GOES 16 GLM Gridded Products via Texas Tech-->Unidata IDD NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/GeostationaryLightningMapper/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\3/\\6/GLM-IDD/\\4\\5\\6\\7.nc4 # GOES CIRA derived products NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/GeoColor/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CIRA/GeoColor/\\5\\6\\7\\8.nc4 NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/DebraDust/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CIRA/DebraDust/\\5\\6\\7\\8.nc4 NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/CloudSnow/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CIRA/CloudSnow/\\5\\6\\7\\8.nc4","title":"LDM Pattern Actions"},{"location":"cave/goes-16-17-satellite/#individual-channels","text":"All geospatial sectors have 16 individual channel products that can be viewed. 
Below are samples of Channel 14 (11.20\u03bcm) for each of the sectors.","title":"Individual Channels"},{"location":"cave/goes-16-17-satellite/#east-conus-1km","text":"","title":"East CONUS 1km"},{"location":"cave/goes-16-17-satellite/#east-full-disk-6km","text":"","title":"East Full Disk 6km"},{"location":"cave/goes-16-17-satellite/#east-mesoscale-sectors-emeso-1-emeso-2","text":"Two floating mesoscale sectors (location will vary day to day from image shown)","title":"East Mesoscale Sectors (EMESO-1, EMESO-2)"},{"location":"cave/goes-16-17-satellite/#west-conus-1km","text":"","title":"West CONUS 1km"},{"location":"cave/goes-16-17-satellite/#west-full-disk","text":"","title":"West Full Disk"},{"location":"cave/goes-16-17-satellite/#west-mesoscale-sectors-wmeso-1-wmeso-2","text":"Two floating mesoscale sectors (location will vary day to day from image shown)","title":"West Mesoscale Sectors (WMESO-1, WMESO-2)"},{"location":"cave/goes-16-17-satellite/#alaska","text":"","title":"Alaska"},{"location":"cave/goes-16-17-satellite/#hawaii","text":"","title":"Hawaii"},{"location":"cave/goes-16-17-satellite/#puerto-rico-prregi","text":"","title":"Puerto Rico (PRREGI)"},{"location":"cave/goes-16-17-satellite/#rgb-composites","text":"RGB Composites are made by combining 3 channels and are available for each sector. Quite a few new RGB products have been added in Unidata's 18.2.1 release. These products are generated on the fly in AWIPS using the existing channel products from EDEX. GOES RGB Imagery is NOT SUPPORTED on macOS or within a Virtual Machine OpenGL Shading Language limitations prevent multi-channel imagery from displaying correctly on Mac or in a Virtual Machine. Please use the Linux or Windows installs to view RGB products.","title":"RGB Composites"},{"location":"cave/goes-16-17-satellite/#day-cloud-phase","text":"","title":"Day Cloud Phase"},{"location":"cave/goes-16-17-satellite/#fire-temperature","text":"","title":"Fire Temperature"},{"location":"cave/goes-16-17-satellite/#day-land-cloud","text":"","title":"Day Land Cloud"},{"location":"cave/goes-16-17-satellite/#day-cloud-convection","text":"","title":"Day Cloud Convection"},{"location":"cave/goes-16-17-satellite/#day-land-cloud-fires","text":"","title":"Day Land Cloud Fires"},{"location":"cave/goes-16-17-satellite/#visir-sandwich","text":"","title":"VIS/IR Sandwich"},{"location":"cave/goes-16-17-satellite/#simple-water-vapor","text":"","title":"Simple Water Vapor"},{"location":"cave/goes-16-17-satellite/#air-mass","text":"","title":"Air Mass"},{"location":"cave/goes-16-17-satellite/#ash","text":"","title":"Ash"},{"location":"cave/goes-16-17-satellite/#day-convection","text":"","title":"Day Convection"},{"location":"cave/goes-16-17-satellite/#day-snow-fog","text":"","title":"Day Snow Fog"},{"location":"cave/goes-16-17-satellite/#differential-water-vapor","text":"","title":"Differential Water Vapor"},{"location":"cave/goes-16-17-satellite/#dust","text":"","title":"Dust"},{"location":"cave/goes-16-17-satellite/#cimss-natural-color","text":"","title":"CIMSS Natural Color"},{"location":"cave/goes-16-17-satellite/#nighttime-microphysics","text":"","title":"Nighttime Microphysics"},{"location":"cave/goes-16-17-satellite/#so2","text":"","title":"SO2"},{"location":"cave/goes-16-17-satellite/#cira-geocolor","text":"","title":"CIRA Geocolor"},{"location":"cave/goes-16-17-satellite/#cira-debra-dust","text":"","title":"CIRA Debra Dust"},{"location":"cave/goes-16-17-satellite/#cira-cloud-snow","text":"","title":"CIRA Cloud 
Snow"},{"location":"cave/goes-16-17-satellite/#daytime-composite-1","text":"","title":"Daytime Composite 1"},{"location":"cave/goes-16-17-satellite/#daytime-composite-5","text":"","title":"Daytime Composite 5"},{"location":"cave/goes-16-17-satellite/#channel-differences","text":"Channel differences are the result of subtracting one channel from another to produce a new product. These products are generated on the fly in AWIPS using the existing channel products from EDEX. There currently 10 channel differences that are offered in CAVE: Split Window (10.3 - 12.3 \u03bcm) Split Cloud Top Phase (11.2 - 8.4 \u03bcm) Night Fog (10.3 - 2.9 \u03bcm) Day Fog (3.9 - 10.3 \u03bcm) Split Fire (2.2 - 1.6 \u03bcm) Split Ozone (9.6 - 10.3 \u03bcm) Split Water Vapor (6.19 - 7.3 \u03bcm) Split Snow (1.6 - 0.64 \u03bcm) Vegetation (0.64 - 0.87 \u03bcm) Upper Level Info (11.2 - 6.19 \u03bcm) The rendering of these products uses the Jep package in Python, which has specific install instructions for Windows.","title":"Channel Differences"},{"location":"cave/goes-16-17-satellite/#derived-products","text":"Derived products are also known as Level 2+ products. Currently there are only derived products from GOES East available in AWIPS. Each sector has a different set of products available. To find out some more information on some of the products please the Quick Guides compiled by CIRA. These may not all be available for each sector. The current products offered in CAVE are listed below and to the right is which GOES East sector they are available for (F=Full Disk, C=CONUS, M=Mesoscale): Aerosol Detection - F,C,M Aerosol Optical Depth - F,C Clear Sky Mask - F,C,M Cloud Optical Depth - F,C Cloud Particle Size -F,C,M Cloud Top Height -F,C,M Cloud Top Phase -F,C,M Cloud Top Pressure -F,C Cloud Top Temperature - F,M Derived CAPE - F,C,M Derived K-Index - F,C,M Derived Lifted Index - F,C,M Derived Showalter Index - F,C,M Derived Total Totals - F,C,M Fire Area - F,C Fire Power - F,C Fire Temperature - F,C Instrument Flight Rule (IFR) Probability - C Low IFR Probability - C Marginal Visual Flight Rules (MVFR) Probability - C Cloud Thickness - C Land Skin Temperature - F,C,M RR/QPE - F Sea Surface Temperature - F Total Precip Water - F,C,M","title":"Derived Products"},{"location":"cave/goes-16-17-satellite/#geostationary-lightning-mapper-glm","text":"Dr. Eric Bruning at Texas Tech has taken the raw GLM data and coded up some new gridded products that can be ingested and displayed in AWIPS. Minimum Flash Area Average Flash Area Flash Extent Density Group Extent Density Total Optical Energy GLM data are located in the menu structure: Satellite > [SECTOR] > GLM Products . You can also access the data from Surface > GLM - Geostationary Lightning Mapper submenus.","title":"Geostationary Lightning Mapper (GLM)"},{"location":"cave/goes-16-17-satellite/#derived-motion-winds","text":"Derived Motion Wind Vectors are produced using sequential ABI images and can provide information about winds at different levels. The wind vectors are computed using both visible and infrared imagery. Winds can be plotted by different pressure layers or individual channels. More information can be found here . Below is an image of the winds at different pressure layers.","title":"Derived Motion Winds"},{"location":"cave/goes-16-17-satellite/#vertical-temperature-and-moisture-profile","text":"Vertical Temperature and Moisture profiles are available in AWIPS. 
Similar to NUCAPS, when loaded in CAVE, a circle is displayed for each location that has a vertical profile available. When clicking on the circle, NSHARP will open with the vertical temperature and moisture profile. These profiles are GFS data that have been adjusted based on the satellite observations. More information can be found here .","title":"Vertical Temperature and Moisture Profile"},{"location":"cave/goes-16-17-satellite/#hdf5-data-store","text":"Decoded GOES satellite data are stored in /awips2/edex/data/hdf5/satellite/ under sector subdirectories: drwxr-xr-x awips fxalpha 4096 AKREGI drwxr-xr-x awips fxalpha 4096 Antarctic drwxr-xr-x awips fxalpha 4096 Arctic drwxr-xr-x awips fxalpha 4096 AREA0600 drwxr-xr-x awips fxalpha 4096 AREA0700 drwxr-xr-x awips fxalpha 4096 AREA3100 drwxr-xr-x awips fxalpha 4096 AREA3101 drwxr-xr-x awips fxalpha 12288 ECONUS drwxr-xr-x awips fxalpha 4096 EFD drwxr-xr-x awips fxalpha 4096 EMESO-1 drwxr-xr-x awips fxalpha 4096 EMESO-2 drwxr-xr-x awips fxalpha 4096 HIREGI drwxr-xr-x awips fxalpha 4096 NEXRCOMP drwxr-xr-x awips fxalpha 4096 PRREGI drwxr-xr-x awips fxalpha 4096 WCONUS drwxr-xr-x awips fxalpha 4096 WFD drwxr-xr-x awips fxalpha 4096 WMESO-1 drwxr-xr-x awips fxalpha 4096 WMESO-2","title":"HDF5 Data Store"},{"location":"cave/hazard-services-alert/","text":"Alerts \uf0c1","title":"Hazard services alert"},{"location":"cave/hazard-services-alert/#alerts","text":"","title":"Alerts"},{"location":"cave/hazard-services-create/","text":"Hazard Creation Methods \uf0c1 Recommender Execution \uf0c1 Recommender Output \uf0c1 River Flood Recommender \uf0c1 Flash Flood Recommender \uf0c1 Storm Track Recommender \uf0c1 Dam/Levee Break Flood Recommender \uf0c1 Burn Scar Recommender \uf0c1 Creating a Hazard from a River Gauge \uf0c1 Selection Tools \uf0c1 Select By Area \uf0c1 Freehand Drawing \uf0c1 Manipulating Hazards \uf0c1 Adjusting a Hazard Polygon \uf0c1 Moving a Polygon Vertex \uf0c1 Deleting a Polygon Vertex \uf0c1 Adding a Polygon Vertex \uf0c1 Moving a Hazard Geometry \uf0c1 Hazard Information Dialog \uf0c1 Hazard Type \uf0c1 Time Range \uf0c1 Details (Metadata) \uf0c1 Hazard Status \uf0c1 #Propose \uf0c1 Preview \uf0c1 Product Staging Dialog \uf0c1 Product Editor \uf0c1 Issue \uf0c1 Ending and Ended \uf0c1","title":"Hazard services create"},{"location":"cave/hazard-services-create/#hazard-creation-methods","text":"","title":"Hazard Creation Methods"},{"location":"cave/hazard-services-create/#recommender-execution","text":"","title":"Recommender Execution"},{"location":"cave/hazard-services-create/#recommender-output","text":"","title":"Recommender Output"},{"location":"cave/hazard-services-create/#river-flood-recommender","text":"","title":"River Flood Recommender"},{"location":"cave/hazard-services-create/#flash-flood-recommender","text":"","title":"Flash Flood Recommender"},{"location":"cave/hazard-services-create/#storm-track-recommender","text":"","title":"Storm Track Recommender"},{"location":"cave/hazard-services-create/#damlevee-break-flood-recommender","text":"","title":"Dam/Levee Break Flood Recommender"},{"location":"cave/hazard-services-create/#burn-scar-recommender","text":"","title":"Burn Scar Recommender"},{"location":"cave/hazard-services-create/#creating-a-hazard-from-a-river-gauge","text":"","title":"Creating a Hazard from a River Gauge"},{"location":"cave/hazard-services-create/#selection-tools","text":"","title":"Selection Tools"},{"location":"cave/hazard-services-create/#select-by-area","text":"","title":"Select By 
Area"},{"location":"cave/hazard-services-create/#freehand-drawing","text":"","title":"Freehand Drawing"},{"location":"cave/hazard-services-create/#manipulating-hazards","text":"","title":"Manipulating Hazards"},{"location":"cave/hazard-services-create/#adjusting-a-hazard-polygon","text":"","title":"Adjusting a Hazard Polygon"},{"location":"cave/hazard-services-create/#moving-a-polygon-vertex","text":"","title":"Moving a Polygon Vertex"},{"location":"cave/hazard-services-create/#deleting-a-polygon-vertex","text":"","title":"Deleting a Polygon Vertex"},{"location":"cave/hazard-services-create/#adding-a-polygon-vertex","text":"","title":"Adding a Polygon Vertex"},{"location":"cave/hazard-services-create/#moving-a-hazard-geometry","text":"","title":"Moving a Hazard Geometry"},{"location":"cave/hazard-services-create/#hazard-information-dialog","text":"","title":"Hazard Information Dialog"},{"location":"cave/hazard-services-create/#hazard-type","text":"","title":"Hazard Type"},{"location":"cave/hazard-services-create/#time-range","text":"","title":"Time Range"},{"location":"cave/hazard-services-create/#details-metadata","text":"","title":"Details (Metadata)"},{"location":"cave/hazard-services-create/#hazard-status","text":"","title":"Hazard Status"},{"location":"cave/hazard-services-create/#propose","text":"","title":"#Propose"},{"location":"cave/hazard-services-create/#preview","text":"","title":"Preview"},{"location":"cave/hazard-services-create/#product-staging-dialog","text":"","title":"Product Staging Dialog"},{"location":"cave/hazard-services-create/#product-editor","text":"","title":"Product Editor"},{"location":"cave/hazard-services-create/#issue","text":"","title":"Issue"},{"location":"cave/hazard-services-create/#ending-and-ended","text":"","title":"Ending and Ended"},{"location":"cave/hazard-services-display/","text":"AWIPS Hazard Service Display \uf0c1 Hazard Services is a collection of AWIPS applications used by forecasters to create, update, and manage hazards, replacing and unifying hazard generation capabilities. WarnGen RiverPro GHG etc. In addition to providing a seamless forecast process for generating short-fused, long-fused, and hydrologic hazards, Hazard Services allows the forecaster to focus on the meteorology of the hazard situation, letting the system take on more of the responsibility for the generation and dissemination of products. Launching Hazard Services \uf0c1 Hazard Services can be launched from the various CAVE perspectives by selection the toolbar item \"Hazards\". When Hazard Services is first started, the Console and the Spatial Display are visible. Spatial Display and Console \uf0c1 The Spatial Display is the Hazard Services drawing layer which is loaded into the CAVE Map Editor when Hazard Services is started. It is the Hazard Services map, displaying hazard areas relative to geopolitical boundaries and handling hazard drawing and editing. Its presence is indicated by the 'Hazard Services (Editable)' line in the CAVE Map Legend, and it supports operations common to other AWIPS drawing layers. The Console is the main control panel for Hazard Services. It is always displayed if Hazard Services is running. Closing it closes Hazard Services as well. The Console is a CAVE View, by default docked within the main window. The Console includes a toolbar and a drop-down (\"view\") menu to the right of or just under its title tab. Below these is the table of hazard events. 
Hazard Services Toolbar \uf0c1 Hydro \uf0c1 The leftmost icon on the tool bar is an indicator if Hydro hazards are being worked or not (it will turn yellow if any active hazards are hidden from view by a filter). Setup (Settings) \uf0c1 Allows you to filter displayed hazard information to focus on the meteorological situation of concern. For example, you may want to focus only on hydrological hazards in a particular time scale and over a particular area. The Settings drop-down menu allows you to select an existing Setting or a recently-used Setting, create a new Setting, edit the current Setting, or delete the current (User) Setting. As new Settings are created, they are added to this drop-down list. The Console\u2019s title tab shows the name of the currently loaded Setting. Settings can also be viewed and edited in the Localization Perspective . Filters \uf0c1 Allows quick modification of the filters being used by the current Setting. Events may be filtered by Hazard Type, Site ID, and/or Status. As the filters are altered, the Hazard Event Table contents change to include only those hazards that pass the filters. For example, with a number of potential events possible, you can select a couple of interest, move them to pending state, and propose one. To reduce clutter in the Console you can hide potentials using the Filters menu, so that all potential events are still present but hidden in both the Console and the Spatial Display. Recommenders (Tools) \uf0c1 The Tools button reveals a drop-down menu listing all the recommenders and other tools available in the current Setting. Recommenders may be run from this menu. When you select a Setting, this menu is populated with appropriate content. Products \uf0c1 Generate RVS\u200b With an FL.x hazard selected in the Console, select this item to bring up a dialog to write an RVS text product. Correct Product\u200b Selecting the Correct Product option provides a list of products that may be corrected. The dialog includes seven columns: Product Category, Issue Time, Event IDs, Hazard Type, VTEC, Expiration Time, and User Name. You can click in a column header to order by, or type in the Search box at the bottom. Upon selecting an item from the list, the Hazard Information Dialog launches. View Product\u200b This option allows you to review issued products, selecting from a list in a Select Product to View dialog. Use the dialog to select the product type (using click, Ctrl-click, Shift-click), then click and select View Product or double-click to see the legacy text. A similar dialog will be produced by selecting the View Products for Selected Events item from the Console pop-up. In this case, the Filter/Query section is not needed, so you\u2019ll see just the lower portion of the illustrated dialog. Spatial Display Modes \uf0c1 When Hazard Services is in Editable state, three buttons set the mode of the Spatial Display, governing how it interprets mouse clicks. Drawing Tools \uf0c1 This menu has six choices: Draw Polygon\u200b When set, mouse clicks on the Spatial Display draw polygons, one click per node (MB1 click to place a node, MB3 click to complete the polygon). AddTo Polygon\u200b If a polygon is active (hazard selected), this choice allows you to augment the area or create a new separate area that will be logically joined with the current polygon. Example of the latter: Note how the single hazard now comprises two polygons. (When you select Preview, these will be joined into a single polygon for issuance.) 
Draw Freehand Polygon\u200b When set, mouse clicks on the Spatial Display draw freehand polygons (MB1 press, drag, and release to draw the polygon's outline). Note that issued text products will conform to current rules limiting polygon vertices to 20 or snapping areas to counties or zones. The freehand, many-vertex, shapes will be modified at some point during the hazard-issuance workflow. AddTo Freehand Polygon\u200b Similar to AddTo Polygon, but drawing is freehand. Note that you can augment both \u201csegments\u201d and freehand polygons with either of the AddTo tools. Remove Polygon Vertices\u200b In the case where you have a polygon with many vertices, it is very difficult to modify a boundary. This tool will remove a section of vertices to make the problem more tractable. With the tool selected, drag with MB1 to enclose a segment of the polygon. When you release, those vertices will be removed. Remove Polygon Area\u200b This tool provides a way to remove sections of a geometry. Press MB1 and drag out an area that intersects your geometry. Upon release, the intersection area will be removed with the new boundary along the curve you drew. If more than one hazard is selected in the Console, only Draw Polygon and Draw Freehand Polygon are available. The others are invalid and dimmed. Select Event \uf0c1 This radio button sets the mode to event selection. When set, mouse clicks on the Spatial Display select hazard events, and drags cause panning. This is the default mode choice of this set of radio buttons. Pan \uf0c1 This radio button sets the mode to pan mode. When clicked, you can pan the map without inadvertently moving or selecting polygons. Maps for Select by Area \uf0c1 The Maps for Select by Area button reveals a drop-down menu allowing the selection of maps that may be used for selecting by area within the Spatial Display. If the button is disabled, no maps that allow select-by-area are currently loaded. If the button is enabled, but a map menu item within the drop-down menu is disabled, that map is loaded but is currently invisible. Temporal Controls \uf0c1 There are two buttons used to control the Timeline view at the right side of the Hazard Table. You can also zoom and pan the Timeline using the mouse. Selected Time Mode This options menu allows you to select the time mode, either a single time or range of times. Show Current Time \u200bThis button moves the Timeline so that the current time is visible toward its left end. View Menu \uf0c1 The View menu is a drop-down menu holding menu items for functions that in general are less frequently used than those available via the toolbar.","title":"AWIPS Hazard Service Display"},{"location":"cave/hazard-services-display/#awips-hazard-service-display","text":"Hazard Services is a collection of AWIPS applications used by forecasters to create, update, and manage hazards, replacing and unifying hazard generation capabilities. WarnGen RiverPro GHG etc. In addition to providing a seamless forecast process for generating short-fused, long-fused, and hydrologic hazards, Hazard Services allows the forecaster to focus on the meteorology of the hazard situation, letting the system take on more of the responsibility for the generation and dissemination of products.","title":"AWIPS Hazard Service Display"},{"location":"cave/hazard-services-display/#launching-hazard-services","text":"Hazard Services can be launched from the various CAVE perspectives by selection the toolbar item \"Hazards\". 
When Hazard Services is first started, the Console and the Spatial Display are visible.","title":"Launching Hazard Services"},{"location":"cave/hazard-services-display/#spatial-display-and-console","text":"The Spatial Display is the Hazard Services drawing layer which is loaded into the CAVE Map Editor when Hazard Services is started. It is the Hazard Services map, displaying hazard areas relative to geopolitical boundaries and handling hazard drawing and editing. Its presence is indicated by the 'Hazard Services (Editable)' line in the CAVE Map Legend, and it supports operations common to other AWIPS drawing layers. The Console is the main control panel for Hazard Services. It is always displayed if Hazard Services is running. Closing it closes Hazard Services as well. The Console is a CAVE View, by default docked within the main window. The Console includes a toolbar and a drop-down (\"view\") menu to the right of or just under its title tab. Below these is the table of hazard events.","title":"Spatial Display and Console"},{"location":"cave/hazard-services-display/#hazard-services-toolbar","text":"","title":"Hazard Services Toolbar"},{"location":"cave/hazard-services-display/#hydro","text":"The leftmost icon on the tool bar is an indicator if Hydro hazards are being worked or not (it will turn yellow if any active hazards are hidden from view by a filter).","title":"Hydro"},{"location":"cave/hazard-services-display/#setup-settings","text":"Allows you to filter displayed hazard information to focus on the meteorological situation of concern. For example, you may want to focus only on hydrological hazards in a particular time scale and over a particular area. The Settings drop-down menu allows you to select an existing Setting or a recently-used Setting, create a new Setting, edit the current Setting, or delete the current (User) Setting. As new Settings are created, they are added to this drop-down list. The Console\u2019s title tab shows the name of the currently loaded Setting. Settings can also be viewed and edited in the Localization Perspective .","title":"Setup (Settings)"},{"location":"cave/hazard-services-display/#filters","text":"Allows quick modification of the filters being used by the current Setting. Events may be filtered by Hazard Type, Site ID, and/or Status. As the filters are altered, the Hazard Event Table contents change to include only those hazards that pass the filters. For example, with a number of potential events possible, you can select a couple of interest, move them to pending state, and propose one. To reduce clutter in the Console you can hide potentials using the Filters menu, so that all potential events are still present but hidden in both the Console and the Spatial Display.","title":"Filters"},{"location":"cave/hazard-services-display/#recommenders-tools","text":"The Tools button reveals a drop-down menu listing all the recommenders and other tools available in the current Setting. Recommenders may be run from this menu. When you select a Setting, this menu is populated with appropriate content.","title":"Recommenders (Tools)"},{"location":"cave/hazard-services-display/#products","text":"Generate RVS\u200b With an FL.x hazard selected in the Console, select this item to bring up a dialog to write an RVS text product. Correct Product\u200b Selecting the Correct Product option provides a list of products that may be corrected. The dialog includes seven columns: Product Category, Issue Time, Event IDs, Hazard Type, VTEC, Expiration Time, and User Name. 
You can click in a column header to order by, or type in the Search box at the bottom. Upon selecting an item from the list, the Hazard Information Dialog launches. View Product\u200b This option allows you to review issued products, selecting from a list in a Select Product to View dialog. Use the dialog to select the product type (using click, Ctrl-click, Shift-click), then click and select View Product or double-click to see the legacy text. A similar dialog will be produced by selecting the View Products for Selected Events item from the Console pop-up. In this case, the Filter/Query section is not needed, so you\u2019ll see just the lower portion of the illustrated dialog.","title":"Products"},{"location":"cave/hazard-services-display/#spatial-display-modes","text":"When Hazard Services is in Editable state, three buttons set the mode of the Spatial Display, governing how it interprets mouse clicks.","title":"Spatial Display Modes"},{"location":"cave/hazard-services-display/#drawing-tools","text":"This menu has six choices: Draw Polygon\u200b When set, mouse clicks on the Spatial Display draw polygons, one click per node (MB1 click to place a node, MB3 click to complete the polygon). AddTo Polygon\u200b If a polygon is active (hazard selected), this choice allows you to augment the area or create a new separate area that will be logically joined with the current polygon. Example of the latter: Note how the single hazard now comprises two polygons. (When you select Preview, these will be joined into a single polygon for issuance.) Draw Freehand Polygon\u200b When set, mouse clicks on the Spatial Display draw freehand polygons (MB1 press, drag, and release to draw the polygon's outline). Note that issued text products will conform to current rules limiting polygon vertices to 20 or snapping areas to counties or zones. The freehand, many-vertex, shapes will be modified at some point during the hazard-issuance workflow. AddTo Freehand Polygon\u200b Similar to AddTo Polygon, but drawing is freehand. Note that you can augment both \u201csegments\u201d and freehand polygons with either of the AddTo tools. Remove Polygon Vertices\u200b In the case where you have a polygon with many vertices, it is very difficult to modify a boundary. This tool will remove a section of vertices to make the problem more tractable. With the tool selected, drag with MB1 to enclose a segment of the polygon. When you release, those vertices will be removed. Remove Polygon Area\u200b This tool provides a way to remove sections of a geometry. Press MB1 and drag out an area that intersects your geometry. Upon release, the intersection area will be removed with the new boundary along the curve you drew. If more than one hazard is selected in the Console, only Draw Polygon and Draw Freehand Polygon are available. The others are invalid and dimmed.","title":"Drawing Tools"},{"location":"cave/hazard-services-display/#select-event","text":"This radio button sets the mode to event selection. When set, mouse clicks on the Spatial Display select hazard events, and drags cause panning. This is the default mode choice of this set of radio buttons.","title":"Select Event"},{"location":"cave/hazard-services-display/#pan","text":"This radio button sets the mode to pan mode. 
When clicked, you can pan the map without inadvertently moving or selecting polygons.","title":"Pan"},{"location":"cave/hazard-services-display/#maps-for-select-by-area","text":"The Maps for Select by Area button reveals a drop-down menu allowing the selection of maps that may be used for selecting by area within the Spatial Display. If the button is disabled, no maps that allow select-by-area are currently loaded. If the button is enabled, but a map menu item within the drop-down menu is disabled, that map is loaded but is currently invisible.","title":"Maps for Select by Area"},{"location":"cave/hazard-services-display/#temporal-controls","text":"There are two buttons used to control the Timeline view at the right side of the Hazard Table. You can also zoom and pan the Timeline using the mouse. Selected Time Mode This options menu allows you to select the time mode, either a single time or range of times. Show Current Time \u200bThis button moves the Timeline so that the current time is visible toward its left end.","title":"Temporal Controls"},{"location":"cave/hazard-services-display/#view-menu","text":"The View menu is a drop-down menu holding menu items for functions that in general are less frequently used than those available via the toolbar.","title":"View Menu"},{"location":"cave/hazard-services-example/","text":"Hazard Life Cycle \uf0c1 Transition from Product Centric toward Information Centric \uf0c1 Examples of Creating, Continuing, and Ending Hazards \uf0c1","title":"Hazard services example"},{"location":"cave/hazard-services-example/#hazard-life-cycle","text":"","title":"Hazard Life Cycle"},{"location":"cave/hazard-services-example/#transition-from-product-centric-toward-information-centric","text":"","title":"Transition from Product Centric toward Information Centric"},{"location":"cave/hazard-services-example/#examples-of-creating-continuing-and-ending-hazards","text":"","title":"Examples of Creating, Continuing, and Ending Hazards"},{"location":"cave/hazard-services-settings/","text":"Hazard Settings \uf0c1 Change Site \uf0c1 Check Hazard Conflicts \uf0c1 Auto Check Hazard Conflicts \uf0c1 Add To Selected \uf0c1 Show Hatched Areas \uf0c1 Change VTEC Mode \uf0c1 Reset Events \uf0c1 Hazard Event Table \uf0c1 Column Headers \uf0c1 Non-Timeline Headers Timeline Header Table Rows \uf0c1 Hazard History \uf0c1 Settings Overview \uf0c1 Settings Menu \uf0c1 Settings Dialog \uf0c1 Hazards Filter Tab \uf0c1 Console Tab \uf0c1 Console Coloring Tab \uf0c1 HID/Spatial Tab \uf0c1 Recommenders Tab \uf0c1 Maps/Overlays Tab \uf0c1","title":"Hazard Settings"},{"location":"cave/hazard-services-settings/#hazard-settings","text":"","title":"Hazard Settings"},{"location":"cave/hazard-services-settings/#change-site","text":"","title":"Change Site"},{"location":"cave/hazard-services-settings/#check-hazard-conflicts","text":"","title":"Check Hazard Conflicts"},{"location":"cave/hazard-services-settings/#auto-check-hazard-conflicts","text":"","title":"Auto Check Hazard Conflicts"},{"location":"cave/hazard-services-settings/#add-to-selected","text":"","title":"Add To Selected"},{"location":"cave/hazard-services-settings/#show-hatched-areas","text":"","title":"Show Hatched Areas"},{"location":"cave/hazard-services-settings/#change-vtec-mode","text":"","title":"Change VTEC Mode"},{"location":"cave/hazard-services-settings/#reset-events","text":"","title":"Reset Events"},{"location":"cave/hazard-services-settings/#hazard-event-table","text":"","title":"Hazard Event 
Table"},{"location":"cave/hazard-services-settings/#column-headers","text":"Non-Timeline Headers Timeline Header","title":"Column Headers"},{"location":"cave/hazard-services-settings/#table-rows","text":"","title":"Table Rows"},{"location":"cave/hazard-services-settings/#hazard-history","text":"","title":"Hazard History"},{"location":"cave/hazard-services-settings/#settings-overview","text":"","title":"Settings Overview"},{"location":"cave/hazard-services-settings/#settings-menu","text":"","title":"Settings Menu"},{"location":"cave/hazard-services-settings/#settings-dialog","text":"","title":"Settings Dialog"},{"location":"cave/hazard-services-settings/#hazards-filter-tab","text":"","title":"Hazards Filter Tab"},{"location":"cave/hazard-services-settings/#console-tab","text":"","title":"Console Tab"},{"location":"cave/hazard-services-settings/#console-coloring-tab","text":"","title":"Console Coloring Tab"},{"location":"cave/hazard-services-settings/#hidspatial-tab","text":"","title":"HID/Spatial Tab"},{"location":"cave/hazard-services-settings/#recommenders-tab","text":"","title":"Recommenders Tab"},{"location":"cave/hazard-services-settings/#mapsoverlays-tab","text":"","title":"Maps/Overlays Tab"},{"location":"cave/import-export/","text":"Import/Export \uf0c1 Export Images/GIFs \uf0c1 The D2D screen can be exported as a PNG image as well as an animated GIF using the File > Export > Image menu option. This captures the current state of the screen, and allows you to set animation options (frame number, dwell time, etc) for exporting GIFs. If you choose to animate, you will either need to rename the destination file to have the .gif extension, or CAVE will pop up a dialog when you go to save, asking you to confirm that you want to output a GIF. Note : This functionality does not currently work on Mac OS because it implements OGL libraries which are not compatible on Mac. Export KML \uf0c1 The Export submenu also includes a KML option ( File > Export > KML ), which allows users to save D2D displays or GFE grids in the KML (Keyhole Markup Language) file format. When zipped (compressed), the KML file format forms a KMZ file, which can be used in applications such as Google Earth. The KML dialog box includes options to select frames to export. This includes exporting all frames, the current/displayed frame, a range of frames, and, in GFE, the selected time range as highlighted in the Grid Manager. Additional options are available for selection under the \"Other Options\" section: Export Hidden : When selected, all displayed and hidden products listed in the Product Legend section of the Main Display Pane will be exported. Export Maps : When selected, all enabled maps displayed within the Main Display Pane will be exported. Shade Earth : When selected, a shaded background is applied to the exported product. If loaded in Google Earth, the earth will be overlaid with a black backdrop, and data will be displayed as it would in D2D with a black background. Show Background Tiles : When selected, data (such as plot data) will display on top of black tiles when loaded in Google Earth. CAVE Import Formats \uf0c1 CAVE supported the following geo-referenced data files. CAVE can import the following through formats through the File > Import menu. Background... Image... BCD File GeoTIFF LPI File SPI File Displays CAVE Export Formats \uf0c1 CAVE can export to the following through the File > Export menu. Image Print Screen KML Editor Display... 
Perspective Displays...","title":"Import/Export"},{"location":"cave/import-export/#importexport","text":"","title":"Import/Export"},{"location":"cave/import-export/#export-imagesgifs","text":"The D2D screen can be exported as a PNG image as well as an animated GIF using the File > Export > Image menu option. This captures the current state of the screen, and allows you to set animation options (frame number, dwell time, etc.) for exporting GIFs. If you choose to animate, you will either need to rename the destination file to have the .gif extension, or CAVE will pop up a dialog when you go to save, asking you to confirm that you want to output a GIF. Note : This functionality does not currently work on Mac OS because it uses OGL (OpenGL) libraries which are not compatible with Mac.","title":"Export Images/GIFs"},{"location":"cave/import-export/#export-kml","text":"The Export submenu also includes a KML option ( File > Export > KML ), which allows users to save D2D displays or GFE grids in the KML (Keyhole Markup Language) file format. When zipped (compressed), the KML file format forms a KMZ file, which can be used in applications such as Google Earth. The KML dialog box includes options to select frames to export. This includes exporting all frames, the current/displayed frame, a range of frames, and, in GFE, the selected time range as highlighted in the Grid Manager. Additional options are available for selection under the \"Other Options\" section: Export Hidden : When selected, all displayed and hidden products listed in the Product Legend section of the Main Display Pane will be exported. Export Maps : When selected, all enabled maps displayed within the Main Display Pane will be exported. Shade Earth : When selected, a shaded background is applied to the exported product. If loaded in Google Earth, the earth will be overlaid with a black backdrop, and data will be displayed as it would in D2D with a black background. Show Background Tiles : When selected, data (such as plot data) will display on top of black tiles when loaded in Google Earth.","title":"Export KML"},{"location":"cave/import-export/#cave-import-formats","text":"CAVE supports the following geo-referenced data files. CAVE can import the following formats through the File > Import menu. Background... Image... BCD File GeoTIFF LPI File SPI File Displays","title":"CAVE Import Formats"},{"location":"cave/import-export/#cave-export-formats","text":"CAVE can export to the following through the File > Export menu. Image Print Screen KML Editor Display... Perspective Displays...","title":"CAVE Export Formats"},{"location":"cave/localization-perspective/","text":"Localization perspective \uf0c1 Localization Levels \uf0c1 AWIPS uses a hierarchical system known as Localization to configure many aspects of EDEX and CAVE, such as available menu items, color maps, and derived parameters. This system allows a user to override existing configurations and customize CAVE. For example, a User -level localization file will supersede any similar file in a higher level (such as Site ). There are three levels of localization , starting with the default BASE BASE - default SITE - 3-letter WFO ID (required) overrides base USER - user-level localization overrides site and base Localization Editor \uf0c1 The Localization Perspective acts as a file editor for the XML, Python, and text files which customize the look and feel of CAVE. This perspective is available in the menu CAVE > Perspective > Localization . 
Users may copy and add files to available directories at their own User localization version. Examples of things that can be accessed through the perspective include (this list is not all-inclusive): NCP Predefined Areas, Color Maps and Style Rules D2D Volume Browser Controls D2D Bundles - Scales (WFO, State(s), etc.) CAVE Map Overlays, Color Maps and Style Rules GFE Tools and Utilities The left panel contains a directory hierarchy of CAVE files for D2D, GFE, and NCP, which can be copied and edited as user localization files. There may be several versions of each file including BASE , CONFIGURED (GFE only), SITE , and USER . Each file version is listed separately under the actual file name. The File Editor view opens the selected configuration file in an appropriate editor. For example, a Python file is opened in a Python editor, and an XML file is opened in an XML editor. Customizing CAVE Menus \uf0c1 Navigate to D2D > Menus and select a submenu (e.g. satellite ). This directory lists all of the menu file contributions made by this data plugin. Most data menu directories will have an index.xml file from which you can investigate the menu structure and make needed changes. Selecting a file such as index.xml (by double clicking, or expanding) will show a sub-menu with a default localization level (typically BASE or CONFIGURED ). Double-click this file to open in the file editor (you may need to click Source at the bottom of the view to see the raw XML). Right-click this file and select Copy To > User ( username ) and you will see the file localization versions update with the new copy. Select this file to edit, and override, the existing version.","title":"Localization Perspective"},{"location":"cave/localization-perspective/#localization-perspective","text":"","title":"Localization perspective"},{"location":"cave/localization-perspective/#localization-levels","text":"AWIPS uses a hierarchical system known as Localization to configure many aspects of EDEX and CAVE, such as available menu items, color maps, and derived parameters. This system allows a user to override existing configurations and customize CAVE. For example, a User -level localization file will supersede any similar file in a higher level (such as Site ). There are three levels of localization , starting with the default BASE BASE - default SITE - 3-letter WFO ID (required) overrides base USER - user-level localization overrides site and base","title":"Localization Levels"},{"location":"cave/localization-perspective/#localization-editor","text":"The Localization Perspective acts as a file editor for the XML, Python, and text files which customize the look and feel of CAVE. This perspective is available in the menu CAVE > Perspective > Localization . 
For example, a Python file is opened in a Python editor, and an XML file is opened in an XML editor.","title":"Localization Editor"},{"location":"cave/localization-perspective/#customizing-cave-menus","text":"Navigate to D2D > Menus and select a submenu (e.g. satellite ). This directory lists all of the menu file contributions made by this data plugin. Most data menu directories will have an index.xml file from which you can investigate the menu structure and make needed changes. Selecting a file such as index.xml (by double clicking, or expanding) will show a sub-menu with a default localization level (typically BASE or CONFIGURED ). Double-click this file to open in the file editor (you may need to click Source at the bottom of the view to see the raw XML). Right-click this file and select Copy To > User ( username ) and you will see the file localization versions update with the new copy. Select this file to edit, and override, the existing version.","title":"Customizing CAVE Menus"},{"location":"cave/maps-views-projections/","text":"Maps, Views, Projections \uf0c1 Default Map Scales \uf0c1 The first toolbar menu item is a dropdown menu for different geographic areas and map projections. The default view is always CONUS , which is a North Polar Stereographic projection centered on the Continental United States. Default projections and areas available in the menu CONUS N. Hemisphere (North Polar Stereographic) Regional (for the selected localization site) WFO (for the selected localization site) World - Mercator World - CED World - Mollweide GOES East Full Disk (Geostationary) GOES West Full Disk (Geostationary) Regional Mercator projections for Africa Alaska Antarctica Arctic Australia, New Zealand Europe Hawaii Japan Pacific Ocean Puerto Rico South America WFO (Has a submenu which contains a map scale for every NWS localization site) New Map Editor / View \uf0c1 Adding a New Map Editor \uf0c1 This can be done in two ways: using the file menu and right clicking on the tab bar. Using the file menu, simply go to: File > New Map . This opens a new map editor tab with the default projection (CONUS Polar Stereographic). To use the tab bar, right-click on or next to any tab and select New Editor Renaming Map Editor \uf0c1 Any of the map editor tabs can be renamed. This can be particularly helpful if you have multiple tabs, with a different focus on each (i.e. different geographic region, different types of data, etc.). New Projection \uf0c1 A new map projection can be created using the file menu: File > New Projection .","title":"Maps, Views, Projections"},{"location":"cave/maps-views-projections/#maps-views-projections","text":"","title":"Maps, Views, Projections"},{"location":"cave/maps-views-projections/#default-map-scales","text":"The first toolbar menu item is a dropdown menu for different geographic areas and map projections. The default view is always CONUS , which is a North Polar Stereographic projection centered on the Continental United States. Default projections and areas available in the menu CONUS N. 
Hemisphere (North Polar Stereographic) Regional (for the selected localization site) WFO (for the selected localization site) World - Mercator World - CED World - Mollweide GOES East Full Disk (Geostationary) GOES West Full Disk (Geostationary) Regional Mercator projections for Africa Alaska Antarctica Arctic Australia,New Zealand Europe Hawaii Japan Pacific Ocean Puerto Rico South America WFO (Has a submenu which contains a map scale for every NWS localization site)","title":"Default Map Scales"},{"location":"cave/maps-views-projections/#new-map-editor-view","text":"","title":"New Map Editor / View"},{"location":"cave/maps-views-projections/#adding-a-new-map-editor","text":"This can be done in two ways: using the file menu and right clicking on the tab bar. Using the file menu, simply go to: File > New Map . This opens a new map editor tab with the default projection (CONUS Polar Stereographic). To use the tab bar, right-click on or next to any tab and select New Editor","title":"Adding a New Map Editor"},{"location":"cave/maps-views-projections/#renaming-map-editor","text":"Any of the map editor tabs can be renamed. This can be particularly helpful if you have multiple tabs, with a different focus on each (i.e. different geographic region, different types of data, etc.).","title":"Renaming Map Editor"},{"location":"cave/maps-views-projections/#new-projection","text":"A new map projection can be created using the file menu: File > New Projection .","title":"New Projection"},{"location":"cave/ncp-perspective/","text":"The National Centers Perspective (NCP) \uf0c1 The NCP toolbar includes two buttons to load Data and Bundles , respectively. The toolbar also includes a Clear button, Zoom and Unzoom , and the NSHARP plugin. Loading Data \uf0c1 Click the \" +Data \" button. Select Category , Source , Group , and Attributes Double-click the product, or select \" Add \" and the data will load to CAVE with the default number of frames (Note: this makes time-matching more difficult. For time-matching multiple products, load as a Bundle .) Latest Available Data Time or Cycle Time is underneath the Attributes column at bottom-right. Create a Bundle \uf0c1 Open the Resource Manager by: Click the \" +Bundle \" button on the toolbar Press the Spacebar Press the \" W \" key Click File -> New -> Bundle . Timeline \uf0c1 A timeline is displayed for available data. Here, the user may choose the dominant resource, number of frames, time range, reference time, etc. for the products to be displayed. Clicking \" Load \" will keep the Resource Manager open while the selected data layers are loaded to the map. \u201c Load and Close \u201d will display data and close the Resource Manager. Save a Bundle \uf0c1 In AWIPS II CAVE, Bundles are organized within the Resource Manager GUI. Steps in the Bundle creation process are prompted with new GUI windows that are specific to the operation taking place, as you will see below. Select resources for a Bundle (as in previous steps). Click the \" Save Bundle \" button. Select or type-in your desired Group Name and Bundle name and click \" Save Bundle \". After saving a Bundle, it's a good idea to confirm that it loads correctly. Select \" Bundle \" -> \u201c Load Bundle \u201d to find your newly created Bundle. The \" Edit Bundle \" button is available to make any changes while loading. Manage Bundles \uf0c1 The third tab in the Resource Manager, titled Manage Bundles , can be used to do just that: modify, create, and delete existing Bundle Groups.
At the top left, there are 3 options: Modify Bundle Group , Create Bundle Group , and Delete Bundle Group . The user can change the order of the Bundles within the Bundle Group by clicking the \" Move Up \" and \u201c Move Down \u201d buttons on the right. A user can add Bundles to an existing Bundle Group by clicking the \u201c Add Bundle \u201d button. A new GUI will pop up, allowing the user to select a Bundle that exists within a different Bundle Group or a current CAVE display. A Bundle may be renamed by clicking the \" Rename Bundle \" button. Similarly, a Bundle may be removed from a specific Bundle Group by clicking the \u201c Remove Bundle \u201d button. NOTE: any changes made here must be saved by clicking the \u201c Save Bundle Group \u201d button on the left-hand side. Deleting a Bundle Group is a fairly straightforward action. First, click the \" Delete Bundle Group \" option on the top-left, then select the Group and Name of the Bundle Group to be deleted. Edit Data Sources \uf0c1 Selecting a Resource to edit allows you to update the number of frames, frame span, range and timeline form. The plugin name and grid name ( GDFILE ) can also be edited. Edit Resource Attributes \uf0c1 Using gridded data, selecting an Attribute to edit allows you to change the GEMPAK syntax used to define the resource. Add a New Grid \uf0c1 Click the \" Bundle \" button and then open the \u201c Manage Data \u201d tab. Select the category (we will use GRID ). Select a model to copy as a template. In this example we select the base \" NAM-12km \" model. Click the \" Copy \" button underneath the GRID column. You can edit the new resource under \" Edit Resource Type \". Choose a name for the new resource (e.g. WRF ) In \" Edit Resource Parameters \", change the \u201c GDFILE= \u201d definition to match the name of the new model in the database (In this case we change GDFILE=NAM to GDFILE=WRF) . Click \" Create \" at the bottom of the window to finish. The new Resource now displays with a ( U ) next to the name, signifying a user-created item. In Attribute Groups , you can add attributes to a resource by clicking \" Edit \". Select the desired Attribute Set and click \" Add -> \" to add it to the right column (You can hold the Ctrl key and select multiple Attributes.) Click \" Save \" and then \u201c Ok \u201d. In the \" Create Bundle \" tab, click \u201c New \u201d to see the new Resource. Multi-Pane Display \uf0c1 The NCP includes a configurable multi-pane display. As seen in the figure below, selecting the \"Multi-Pane\" check box extends the GUI window and displays additional options. Selecting the \" Multi-Pane Display \" checkbox enables the multi-pane builder. This new feature allows you to customize the number of panes you would like to display in AWIPS II CAVE. The \"Select Pane\" portion of the GUI allows you to load different products into each pane, which includes importing previously created bundles. Here are a few quick steps for creating a Multi-pane display in AWIPS II: Click the Multi-Pane checkbox in the Resource Manager Select the number of Rows and Columns you would like your data display to contain Select the precise pane in which you would like a specific product (i.e.
Row 1, Column1) Choose a product through the Add button (See Data Selection above) Select a different cell in your multi-paned display in which you would like to display a product the user will need to load a separate product from the Resource Manager for each pane in the select pane layout Repeat step #4 Repeat the previous steps, until all of your panes have products queued up inside. Click \" Load \" and your multi-paned display will appear Load Multiple Bundles \uf0c1 The Load Bundle tab in the Resource Manager can be used to load Bundles previously created by the user: The user should select name of the Group in which the desired Bundle is housed. After doing so, a list of available Bundles will appear in the centrally located \"Bundles\" pane. Selecting a Bundle will populate the pane on the right, which displays the contents of each Bundle, and also provides information on its Localization settings. Clicking \"Load\" or \u201cLoad and Close\u201d at the bottom of this window will load the saved Bundle. Before doing so, you can adjust things like Frames, dominant resources, time range, etc. in the \u201c Select Timeline \u201d section at the bottom of the window. Multiple Bundles can be selected and loaded all at once by simply hold the Ctl key and multi-selecting Bundles from the central pane, and then clicking either of the Load buttons. If multiple Bundles are loaded at once, they will each be displayed in different tabs in the CAVE interface. The order/arrangement of the Bundles will be mimicked in the order of the tabs when displayed in CAVE. Finally, the user may also edit an Bundle in this tab, simply by clicking the \" Edit \" button, and making desired changes in the GUI that pops up.","title":"The National Centers Perspective (NCP)"},{"location":"cave/ncp-perspective/#the-national-centers-perspective-ncp","text":"The NCP toolbar includes two buttons to load Data and Bundles , respectively. The toolbar also include a Clear button, Zoom and Unzoom , and the NSHARP plugin.","title":"The National Centers Perspective (NCP)"},{"location":"cave/ncp-perspective/#loading-data","text":"Click the \" +Data \" button. Select Category , Source , Group , and Attributes Double-click the product, or select \" Add \" and the data will load to CAVE with the default number of frames (Note: this makes time-matching more difficult. For time-matching multiple products, load as a Bundle .) Latest Available Data Time or Cycle Time is underneath the Attributes column at bottom-right.","title":"Loading Data"},{"location":"cave/ncp-perspective/#create-a-bundle","text":"Open the Resource Manager by: Click the \" +Bundle \" button on the toolbar Press the Spacebar Press the \" W \" key Click File -> New -> Bundle .","title":"Create a Bundle"},{"location":"cave/ncp-perspective/#timeline","text":"A timeline is displayed for available data. Here, the user may choose the dominant resource, number of frames, time range, reference time, etc. for the products to be displayed. Clicking \" Load \" will keep open the Resource Manager while the selected data layers are loaded to the map. \u201d Load and Close \u201d will display data and close the Resource Manager.","title":"Timeline"},{"location":"cave/ncp-perspective/#save-a-bundle","text":"In AWIPS II CAVE, Bundles are organized within the Resource Manager GUI. Steps in the Bundle creation process are prompted with new GUI windows that are specific to the operation taking place, as you will see below. Select resources for a Bundle (as in previous steps). 
Click the \" Save Bundle \". Select or type-in your desired Group Name and Bundle name and click \" Save Bundle \". After saving a Bundle, its a good idea to confirm that it loads correctly. Select \" Bundle \" -> \u201c Load Bundle \u201d to find your newly created Bundle. The \" Edit Bundle \" button is available to make any changes while loading.","title":"Save a Bundle"},{"location":"cave/ncp-perspective/#manage-bundles","text":"The third tab in the Resource Manager, titled Manage Bundles can be used to do just that: modify, create, and delete existing Bundle Groups. At the top left, there are 3 options: Modify Bundle Group , Create Bundle Group , and Delete Bundle Group . The user can change the order of the Bundles within the Bundle Group, by clicking the \" Move Up \" and \u201c Move Down \u201d buttons on the right. A user can add Bundles to an existing Bundle Group by clicking the \u201c Add Bundle \u201d button. A new Gui will pop up, allowing the user to select a Bundle that exists within a different Bundle Group or a current CAVE display. A Bundle may be renamed by clicking the \" Rename Bundle \" button. Similarly, an Bundle may be removed from a specific Bundle Group by clicking the \u201c Remove Bundle \u201d button. NOTE: any changes made here must be saved by clicking the \u201c Save Bundle Group \u201d button on the left-hand side. Deleting an Bundle Group is a fairly straightforward action. First, click the \" Delete Bundle Group \" option on the top-left, then select the Bundle Group Group and Name to be deleted.","title":"Manage Bundles"},{"location":"cave/ncp-perspective/#edit-data-sources","text":"Selection a Resource to edit allows you to update the number of frames, frame span, range and timeline form. The plugin name and grid name ( GDFILE ) can also be edited.","title":"Edit Data Sources"},{"location":"cave/ncp-perspective/#edit-resource-attributes","text":"Using gridded data, selecting an Attribute to edit allows you to change the GEMPAK syntax used to define the resource.","title":"Edit Resource Attributes"},{"location":"cave/ncp-perspective/#add-a-new-grid","text":"Click the \" Bundle \" button and then open the \u201c Manage Data \u201d tab. Select the category (we will use GRID ). Select a model to copy as a template. In this example we select the base \" NAM-12km \" model. Click the \" Copy \" button underneath the GRID column. You can edit the new resource under \" Edit Resource Type \". Choose a name for the new resource (e.g. WRF ) In \" Edit Resource Parameters \", change the \u201c GDFILE= \u201d definition to match the name of the new model in the database (In this case we change GDFILE=NAM to GDFILE=WRF) . Click \" Create \" at the bottom of the window to finish. The new Resource now displays with a ( U ) next to the name, signifying a user-created item. In Attribute Groups , you can add attributes to a resource by clicking \" Edit \". Select the desired Attribute Set and click \" Add -> \" to add it to the right column (You can hold the Ctrl key and select multiple Attributes.) Click \" Save \" and then \u201c Ok \u201d. In the \" Create Bundle \" tab, click \u201c New \u201d to see the new Resource.","title":"Add a New Grid"},{"location":"cave/ncp-perspective/#multi-pane-display","text":"The NCP includes a configurable multi-pane display. As seen in the figure below, selecting the \"Multi-Pane\" check box extends the GUI window and displays additional options. Selecting the \" Multi-Pane Display \" checkbox enables the multi-pane builder. 
This new feature allows you to customize the number of panes you would like to display in AWIPS II CAVE. The \"Select Pane\" portion of the GUI allows you to load different products into each pane, which includes importing previously created bundles. Here are a few quick steps for creating a Multi-pane display in AWIPS II: Click the Multi-Pane checkbox in the Resource Manager Select the number of Rows and Columns you would like your data display to contain Select the precise pane in which you would like a specific product (i.e. Row 1, Column1) Choose a product through the Add button (See Data Selection above) Select a different cell in your multi-paned display in which you would like to display a product the user will need to load a separate product from the Resource Manager for each pane in the select pane layout Repeat step #4 Repeat the previous steps, until all of your panes have products queued up inside. Click \" Load \" and your multi-paned display will appear","title":"Multi-Pane Display"},{"location":"cave/ncp-perspective/#load-multiple-bundles","text":"The Load Bundle tab in the Resource Manager can be used to load Bundles previously created by the user: The user should select the name of the Group in which the desired Bundle is housed. After doing so, a list of available Bundles will appear in the centrally located \"Bundles\" pane. Selecting a Bundle will populate the pane on the right, which displays the contents of each Bundle, and also provides information on its Localization settings. Clicking \"Load\" or \u201cLoad and Close\u201d at the bottom of this window will load the saved Bundle. Before doing so, you can adjust things like Frames, dominant resources, time range, etc. in the \u201c Select Timeline \u201d section at the bottom of the window. Multiple Bundles can be selected and loaded all at once by simply holding the Ctrl key and multi-selecting Bundles from the central pane, and then clicking either of the Load buttons. If multiple Bundles are loaded at once, they will each be displayed in different tabs in the CAVE interface. The order/arrangement of the Bundles will be mimicked in the order of the tabs when displayed in CAVE. Finally, the user may also edit a Bundle in this tab, simply by clicking the \" Edit \" button, and making desired changes in the GUI that pops up.","title":"Load Multiple Bundles"},{"location":"cave/nsharp/","text":"NSHARP \uf0c1 NSHARP, which stands for the N ational Center S ounding and H odograph A nalysis and R esearch P rogram, is an AWIPS plugin originally based on NAWIPS NSHAREP, SPC's BigSHARP sounding display tool, and the Python package SHARPpy . NSHARP is available in a number of ways in CAVE: From the D2D toolbar select the NSHARP icon: From the Upper Air menu select NSHARP Soundings From the Upper Air menu select a station from the RAOB menus From the Upper Air menu select NUCAPS Soundings From the Models or Tools menu select Volume Browser Make sure Sounding is selected from the menu at the top Select a source that has data (signified by a green box to the right) Select Soundings from the Fields menu Select any point from the Planes menu and an option will load in the table To create a new point, go to Tools > Points and use the right-click-hold menu to create a new point anywhere on the map Use the Load button to load data and open the NSharp display NSHARP Configurations \uf0c1 NSHARP has four configurations for use in different operational settings: SPC Wide - more insets and graphs at the expense of timeline/station inventory.
D2D Skewt Standard - default for WFOs, larger SkewT with inventory, no Wind/Height, temperature advection, insets, or graphs. D2D Lite - Skew-T, table, and inventory only. OPC - Ocean Prediction Center display. To change the NSHARP configuration: Open the NSHARP(D2D) controls tab by clicking on the Nsharp toolbar ( ) icon again Click the Configure button Click Display Pane Configuration (third from the bottom) Use the dropdown to choose a configuration, apply, save, close If you would like to interactively explore the different graphical areas in NSHARP on the Web , see the NSHARP Interactive Overview . Skew-T Display \uf0c1 The Skew-T display renders a vertical profile of temperature, dew point, and wind for RAOBs and model point soundings using a Skew-T Log-P diagram. The box in the upper-left of the main display is linked to the cursor readout when over the SkewT chart. It reports the temperature, dewpoint, wind direction and speed, pressure, height AGL, and relative humidity of the trace. Skew-T is the default upper air chart in AWIPS, and can be changed to turbulence display ( T ) or an icing display ( I ). These options are available as buttons at the bottom of the NSHARP(D2D) controls tab (mentioned in NSHARP Configurations ). Use the AWIPS-2 NSHARP Interactive Overview page for more information about the Skew-T display. Windspeed vs Height and Inferred Temperature Advection \uf0c1 The windspeed vs height and inferred temperature advection with height plot is situated next to the SkewT to show the values at the same heights. Inferred temperature advection is from the thermal wind. Use the AWIPS-2 NSHARP Interactive Overview page for more information. Hodograph Display \uf0c1 This panel contains the hodograph display from the sounding data. The rings in the hodograph represent the wind speed in 20 knot increments. The hodograph trace uses different colors to highlight wind observations in 3 km height increments. This display also contains information such as the mean wind, Bunkers Left/Right Moving storm motion, upshear and downshear Corfidi vectors, and a user-defined motion. Use the AWIPS NSHARP Interactive Overview page for more information about the hodograph display. Insets \uf0c1 In the SPC Wide Screen Configuration there are four small insets beneath the hodograph containing storm-relative windspeed versus height, a Storm Slinky, Theta-E vs Pressure, Possible Watch Type, Theta-E vs Height, and storm-relative wind vectors. There are buttons in the NSHARP(D2D) control button tab that toggle the six possible contents in the four boxes. There are also two buttons PvIn and NxIn in the control tab that can be used to cycle through the previous and next inset. Use the AWIPS NSHARP Interactive Overview page for more information. Table Output Displays \uf0c1 The Table Output Displays contain five different pages of parameters ranging from parcel instability to storm relative shear to severe hazards potential. There are two buttons PtDt and NxDt in the controls tab that can be used to cycle through the previous and next tables. Use the AWIPS NSHARP Interactive Overview page for more information on the tables and a list/definition of the parameters available.
Graphs/Statistics \uf0c1 In the SPC Wide Screen Configuration there are two graphs boxes under the insets, and they can display information on Enhanced Bulk Shear, Significant Tornado Parameter, Significant Hail Parameter (SHIP), Winter Weather, Fire Weather, Hail model (not implemented), and the Sounding Analog Retrieval System (SARS). There are buttons in the NSHARP(D2D) control button tab that toggle the six possible contents in the two boxes. Use the AWIPS NSHARP Interactive Overview page for more information. Sounding Inventory \uf0c1 This section controls the inventory of the soundings that have been loaded for potential display in NSHARP. The different colors of the text represent variously that a sounding/station is being displayed, available for display, or not available for display. Use the AWIPS NSHARP Interactive Overview page for more information on how to use the sounding inventory and time line.","title":"NSHARP"},{"location":"cave/nsharp/#nsharp","text":"NSHARP, which stands for the N ational Center S ounding and H odograph A nalysis and R esearch P rogram, is an AWIPS plugin originally based on NAWIPS NSHAREP, SPCs BigSHARP sounding display tool, and the Python package SHARpy . NSHARP is available a number of ways in CAVE: From the D2D toolbar select the NSHARP icon: From the Upper Air menu select NSHARP Soundings From the Upper Air menu select a station from the RAOB menus From the Upper Air menu select NUCAPS Soundings From the Models or Tools menu select Volume Browser Make sure Sounding is selected from the menu at the top Select a source that has data (signified by a green box to the right) Select Soundings from the Fields menu Select any point from the Planes menu and an option will load in the table To create a new point go to select Tools > Points and use the right-click-hold menu to create a new point anywhere on the map Use the Load button to load data and open the NSharp display","title":"NSHARP"},{"location":"cave/nsharp/#nsharp-configurations","text":"NSHARP has four configurations for use in different operational settings: SPC Wide - more insets and graphs at the expense of timeline/station inventory. D2D Skewt Standard - default for WFOs, larger SkewT with inventory, no Wind/Height, temperature advection, insets, or graphs. D2D Lite - Skew-T, table, and inventory only. OPC - Ocean Prediction Center display. To change the NSHARP confiuguration: Open the NSHARP(D2D) controls tab by clicking on the Nsharp toolbar ( ) icon again Click the Configure button Click Display Pane Configuration (third from the bottom) Use the dropdown to choose a configuration, apply, save, close If you would like to interactively explore the different graphical areas in NSHARP on the Web , see the NSHARP Interactive Overview .","title":"NSHARP Configurations"},{"location":"cave/nsharp/#skew-t-display","text":"The Skew-T display renders a vertical profile of temperature, dew point, and wind for RAOBs and model point soundings using a Skew-T Log-P diagram. The box in the upper-left of the main display is linked to the cursor readout when over the SkewT chart. It reports the temperature, dewpoint, wind direction and speed, pressure, height AGL, and relative humidity of the trace. Skew-T is the default upper air chart in AWIPS, and can be changed to turbulence display ( T ) or an icing display ( I ). These options are available as buttons at the bottom of the NSHARP(D2D) controls tab (mentioned in NSHARP Configurations ). 
Use the AWIPS-2 NSHARP Interactive Overview page for more information about the Skew-T display.","title":"Skew-T Display"},{"location":"cave/nsharp/#windspeed-vs-height-and-inferred-temperature-advection","text":"The windspeed vs height and inferred temperature advection with height plot is situated next to the SkewT to show the values at the same heights. Inferred temperature advection is from the thermal wind. Use the AWIPS-2 NSHARP Interactive Overview page for more information.","title":"Windspeed vs Height and Inferred Temperature Advection"},{"location":"cave/nsharp/#hodograph-display","text":"This panel contains the hodograph display from the sounding data. The rings in the hodograph represent the wind speed in 20 knot increments. The hodograph trace uses different colors to highlight wind observations in 3 km height increments. This display also contains information such as the mean wind, Bunkers Left/Right Moving storm motion, upshear and downshear Corfidi vectors, and a user-defined motion. Use the AWIPS NSHARP Interactive Overview page for more information about the hodograph display.","title":"Hodograph Display"},{"location":"cave/nsharp/#insets","text":"In the SPC Wide Screen Configuration there are four small insets beneath the hodograph containing storm-relative windspeed versus height, a Storm Slinky, Theta-E vs Pressure, Possible Watch Type, Thea-E vs Height, and storm-relative wind vectors. There are buttons in the NSHARP(D2D) control button tab that toggle the six possible contents in the four boxes. There are also two buttons PvIn and NxIn in the control tab that can be used to cycle through the previous and next inset. Use the AWIPS NSHARP Interactive Overview page for more information.","title":"Insets"},{"location":"cave/nsharp/#table-output-displays","text":"The Table Output Displays contains five different pages of parameters ranging from parcel instability to storm relative shear to severe hazards potential. There are two buttons PtDt and NxDt in the controls tab that can be used to cycle through the previous and next tables. Use the AWIPS NSHARP Interactive Overview page for more information on the tables and a list/definition of the parameters available.","title":"Table Output Displays"},{"location":"cave/nsharp/#graphsstatistics","text":"In the SPC Wide Screen Configuration there are two graphs boxes under the insets, and they can display information on Enhanced Bulk Shear, Significant Tornado Parameter, Significant Hail Parameter (SHIP), Winter Weather, Fire Weather, Hail model (not implemented), and the Sounding Analog Retrieval System (SARS). There are buttons in the NSHARP(D2D) control button tab that toggle the six possible contents in the two boxes. Use the AWIPS NSHARP Interactive Overview page for more information.","title":"Graphs/Statistics"},{"location":"cave/nsharp/#sounding-inventory","text":"This section controls the inventory of the soundings that have been loaded for potential display in NSHARP. The different colors of the text represent variously that a sounding/station is being displayed, available for display, or not available for display. Use the AWIPS NSHARP Interactive Overview page for more information on how to use the sounding inventory and time line.","title":"Sounding Inventory"},{"location":"cave/pgen/","text":"Product Generator (PGEN) \uf0c1 The National Centers use PGEN to draw annotations and generate all their products, and it is included in D-2D to support Center Weather Service Units (CWSUs) making AWC-style SIGMETs. 
While this is not intended to be used for other purposes, there are a number of unique drawing and annotation tools that can be used to make images using the CAVE->export->Image once a display has been created.","title":"Product Generator (PGEN)"},{"location":"cave/pgen/#product-generator-pgen","text":"The National Centers use PGEN to draw annotations and generate all their products, and it is included in D-2D to support Center Weather Service Units (CWSUs) making AWC-style SIGMETs. While this is not intended to be used for other purposes, there are a number of unique drawing and annotation tools that can be used to make images using the CAVE->export->Image once a display has been created.","title":"Product Generator (PGEN)"},{"location":"cave/unused-components/","text":"An overview of components that are used operationally by the NWS but are made inactive in the Unidata release. Some components are impractical for non-operational use, and some are unavailable for distribution outside of the NWS. Data Delivery \uf0c1 The \"Data Delivery\" option opens the Data Delivery application. Data Delivery is a permission-based application, meaning that the System Manager or User Administrator controls the user's access to the Data Delivery functionalities. If granted permission to access this application, Data Delivery allows a user to subscribe to a data source or create an ad hoc request and have the data delivered in near real time. Whether delivered by subscription or in response to an ad hoc request, the data can be tailored to a user's specific temporal, geographic, and parameter needs. For a detailed description of the Data Delivery application, refer to Section 16.1. Collaboration \uf0c1 The \"Collaboration\" option offers two main functions: chatting and sharing displays. Chat allows users to send and receive instant messages or chat with fellow forecasters and offices in a chat room. Sharing displays adds to the chat room capabilities and allows the room's creator to show a CAVE map display to other participants in the room. For a detailed description of \"Collaboration\" and information on how to create a chat session and share displays, refer to Section 16.2. Archive Case Creation \uf0c1 The \"Archive Case Creation\" option is a component of the AWIPS-2 Archiver application. The archiver application is a permission-based functionality. It allows a user to extract stored weather event data and copy it into a user-defined directory to be archived (e.g., burned to a DVD). The archived data can later be played back for simulation of weather events using the WES-2 Bridge. For a detailed description of the AWIPS-2 Archiver application and the \"Archive Case Creation\" component, refer to Section 16.3. Archive Retention: The \"Archive Retention\" option is a component of the AWIPS-2 Archiver application. The archiver retention functionality and its purge component, which runs on EDEX, are permission-based functionalities. Access to the \"Archive Retention\" option is limited to User Administrators and users identified as a database/purge focal point. More information on these AWIPS-2 Archiver application functionalities are provided in the System Manager's Manual. AWIPS User Administration \uf0c1 Some of the functionalities of certain CAVE applications (currently, Data Delivery and Localization) are reserved for designated users. User Administrators choose the \"AWIPS User Administration\" option to access the screens they use to set permissions and roles for the reserved functions. 
Access to the \"AWIPS User Administration\" option is limited to User Administrators. Other users who select this option will be denied access and receive the Alert Message shown in Exhibit 2.2.6.1-5. More information on AWIPS User Administration is provided in the System Manager's Manual. LDAD (Local Data Acquisition and Dissemination) \uf0c1 The LDAD system provides the means to acquire local data sets, perform quality control on the incoming data, and disseminate weather data to the external user community. It contains a number of components that reside on the internal AWIPS network and on the external LDAD component (on the LDAD server cluster). The internal and external components at WFOs are separated by a security firewall. The basic LDAD concept simplifies this process for both the data providers and for the support team. LDAD uses a simple data format, ASCII Comma Separated Values Text (CSVText), which separates each data field by a comma. A set of metadata files, created and maintained by the data provider or in conjunction with site personnel, will be used by the acquisition decoder. This facilitates data processing in hydrometeorological units instead of sensor units and removes the need for conversion routines. The simplicity of the CSVText format increases the likelihood that the data provider will use this standardized format. All LDAD acquisition data are categorized and stored into the following four classes: * Mesonet for surface weather observations * Hydro for rain and stream observations * Manual for manual observations such as cooperative observers * Upper air for multilevel observations such as profilers. The LDAD functionality supports the acquisition of the Integrated Flood Observing and Warning System (IFLOWS); ALERT; Mesonet; Profiler; RRS/Upper Air; Gauges (LARC, Handar, Campbell, Sutron); COOP (Co-operative Observations); and other data transported via LDM, Rsync, or other TCP/IP Protocols. The Data Acquisition function is achieved when data is transmitted to the internal (trusted) AWIPS servers. The data is transmitted to and from the LDAD Cluster via TCP/IP protocols or RS-232 communications. The Data Dissemination function is achieved when data is transmitted to the LDAD Cluster from the internal AWIPS system and is then distributed to External Users. The data can be acquired, stored, and displayed once fully configured. The LDAD System consists of two LDAD servers (LS2/3), a LAN switch (SMC 8024), a Terminal Server (Cyclades ACS32), Modems (MultiTech MT5600BR), and a LAN DMZ (HP ProCurve 2824). The DMZ consists of two SSG 320M Firewalls, a Netgear 16 switch, and two Netgear 5 port hubs. The LDAD baseline processes run on the LDAD Cluster (DMZ) and the AWIPS PX Cluster (Internal). Other local applications may run on other internal clusters, such as DX cluster in the case of the LDAD Dissemination Server. Data is transmitted through the DMZ either to the Trusted (internal) AWIPS system or to the Untrusted (External) Users/Systems.","title":"Unused components"},{"location":"cave/unused-components/#data-delivery","text":"The \"Data Delivery\" option opens the Data Delivery application. Data Delivery is a permission-based application, meaning that the System Manager or User Administrator controls the user's access to the Data Delivery functionalities. If granted permission to access this application, Data Delivery allows a user to subscribe to a data source or create an ad hoc request and have the data delivered in near real time. 
Whether delivered by subscription or in response to an ad hoc request, the data can be tailored to a user's specific temporal, geographic, and parameter needs. For a detailed description of the Data Delivery application, refer to Section 16.1.","title":"Data Delivery"},{"location":"cave/unused-components/#collaboration","text":"The \"Collaboration\" option offers two main functions: chatting and sharing displays. Chat allows users to send and receive instant messages or chat with fellow forecasters and offices in a chat room. Sharing displays adds to the chat room capabilities and allows the room's creator to show a CAVE map display to other participants in the room. For a detailed description of \"Collaboration\" and information on how to create a chat session and share displays, refer to Section 16.2.","title":"Collaboration"},{"location":"cave/unused-components/#archive-case-creation","text":"The \"Archive Case Creation\" option is a component of the AWIPS-2 Archiver application. The archiver application is a permission-based functionality. It allows a user to extract stored weather event data and copy it into a user-defined directory to be archived (e.g., burned to a DVD). The archived data can later be played back for simulation of weather events using the WES-2 Bridge. For a detailed description of the AWIPS-2 Archiver application and the \"Archive Case Creation\" component, refer to Section 16.3. Archive Retention: The \"Archive Retention\" option is a component of the AWIPS-2 Archiver application. The archiver retention functionality and its purge component, which runs on EDEX, are permission-based functionalities. Access to the \"Archive Retention\" option is limited to User Administrators and users identified as a database/purge focal point. More information on these AWIPS-2 Archiver application functionalities are provided in the System Manager's Manual.","title":"Archive Case Creation"},{"location":"cave/unused-components/#awips-user-administration","text":"Some of the functionalities of certain CAVE applications (currently, Data Delivery and Localization) are reserved for designated users. User Administrators choose the \"AWIPS User Administration\" option to access the screens they use to set permissions and roles for the reserved functions. Access to the \"AWIPS User Administration\" option is limited to User Administrators. Other users who select this option will be denied access and receive the Alert Message shown in Exhibit 2.2.6.1-5. More information on AWIPS User Administration is provided in the System Manager's Manual.","title":"AWIPS User Administration"},{"location":"cave/unused-components/#ldad-local-data-acquisition-and-dissemination","text":"The LDAD system provides the means to acquire local data sets, perform quality control on the incoming data, and disseminate weather data to the external user community. It contains a number of components that reside on the internal AWIPS network and on the external LDAD component (on the LDAD server cluster). The internal and external components at WFOs are separated by a security firewall. The basic LDAD concept simplifies this process for both the data providers and for the support team. LDAD uses a simple data format, ASCII Comma Separated Values Text (CSVText), which separates each data field by a comma. A set of metadata files, created and maintained by the data provider or in conjunction with site personnel, will be used by the acquisition decoder. 
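As a purely hypothetical illustration of the CSVText idea described above, a mesonet record might look like a single comma-separated line. The field names, station identifier, and values here are invented for the example and are not the actual LDAD schema, which is defined by each site's metadata files.

```bash
# Hypothetical CSVText record -- the layout below is an assumption for illustration only.
cat > /tmp/example_mesonet.csvtext <<'EOF'
stationId,obsTime,temperature,dewpoint,windDir,windSpeed,precipAccum
XYZ01,2019-05-01 18:05:00,21.3,12.8,270,6.2,0.00
EOF
cat /tmp/example_mesonet.csvtext   # each field is separated by a comma
```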
This facilitates data processing in hydrometeorological units instead of sensor units and removes the need for conversion routines. The simplicity of the CSVText format increases the likelihood that the data provider will use this standardized format. All LDAD acquisition data are categorized and stored into the following four classes: * Mesonet for surface weather observations * Hydro for rain and stream observations * Manual for manual observations such as cooperative observers * Upper air for multilevel observations such as profilers. The LDAD functionality supports the acquisition of the Integrated Flood Observing and Warning System (IFLOWS); ALERT; Mesonet; Profiler; RRS/Upper Air; Gauges (LARC, Handar, Campbell, Sutron); COOP (Co-operative Observations); and other data transported via LDM, Rsync, or other TCP/IP Protocols. The Data Acquisition function is achieved when data is transmitted to the internal (trusted) AWIPS servers. The data is transmitted to and from the LDAD Cluster via TCP/IP protocols or RS-232 communications. The Data Dissemination function is achieved when data is transmitted to the LDAD Cluster from the internal AWIPS system and is then distributed to External Users. The data can be acquired, stored, and displayed once fully configured. The LDAD System consists of two LDAD servers (LS2/3), a LAN switch (SMC 8024), a Terminal Server (Cyclades ACS32), Modems (MultiTech MT5600BR), and a LAN DMZ (HP ProCurve 2824). The DMZ consists of two SSG 320M Firewalls, a Netgear 16 switch, and two Netgear 5 port hubs. The LDAD baseline processes run on the LDAD Cluster (DMZ) and the AWIPS PX Cluster (Internal). Other local applications may run on other internal clusters, such as DX cluster in the case of the LDAD Dissemination Server. Data is transmitted through the DMZ either to the Trusted (internal) AWIPS system or to the Untrusted (External) Users/Systems.","title":"LDAD (Local Data Acquisition and Dissemination)"},{"location":"cave/warngen/","text":"WarnGen Walkthrough \uf0c1 WarnGen is an AWIPS graphics application for creating and issuing warnings as is done by National Weather Service offices. In the Unidata AWIPS release it is a non-operational forecasting tool, meaning it allows users to experiment and simulate with the drawing and text-generation tools, but prevents you from transmitting a generated warning upstream . 
In order to select a feature it must be within your CAVE localization coverage (load Maps > County Warning Areas to see coverages) Quick Steps - Using WarnGen in Unidata AWIPS CAVE \uf0c1 Load NEXRAD Display from the Radar menu Choose a CWA with active severe weather (PAH is used in the video below) Re-localize to this site in the CAVE > Preferences > Localization menu Exit out of CAVE and reload (you should notice the new CWA at the top of CAVE) Load radar data from the local radar menu kpah > Z + SRM8 Use the \"period\" key in the number pad to toggle between the 0.5 Reflectivity and SRM Click the WarnGen toolbar button or load from Tools > WarnGen Drag the storm marker to the center of a storm feature Step through frames back and forth and adjust the marker to match the trajectory of the storm feature Click Track in the WarnGen GUI to update the polygon shape and trajectory From the WarnGen dialog select the type of warning to generate, time range, basis of the warning, and any threats (wind, hail, etc) Click Create Text at the bottom of the WarnGen dialog to generate a text warning product in a new window Note: Since you are not \"issuing\" the warning, leave the top two rows blank (\"TTAAii\" and \"CCCC\") and click \"Enter\" and a separate text window should open Click Reset at the top of the WarnGen dialog to reset the storm marker at any time Select Line of Storms to enable a two-pointed vector which is to be positioned parallel to a storm line To add another vertex , middle button click along the polygon Video - Using WarnGen in AWIPS \uf0c1 The video below walks through creating a warning polygon and text in AWIPS. More detailed information can be found in the text below the video. Load NEXRAD level 3 display \uf0c1 Select the menu Radar > NEXRAD Display and note coverage areas of current severe weather. We choose a CWA ID that contains some active severe weather (PAH Paducah, Kentucky, in this example). Select SITE Localization \uf0c1 Open CAVE > Preferences > Localization , select the CWA site ID (PAH) for the coverage area you want to use, followed by Apply and Okay and restart CAVE. Once CAVE is restarted, you should notice the new CWA at the top of the CAVE window. Load single radar data from the local radars \uf0c1 Click on the local radar kpah > Z + SRM8 . Use the \"period\" key in the number pad to toggle between the 0.5 Reflectivity and SRM. Launch WarnGen \uf0c1 Select WarnGen from the D2D Toolbar or from the Tools > WarnGen menu. When started, the storm centroid marker appears and the WarnGen GUI will pop up as a separate window. Generate a Storm Motion Vector \uf0c1 Click and drag Drag Me to Storm to the feature you want to track (WarnGen uses a dot to track a single storm and a line to track a line of storms). Step back 3 to 4 frames. Drag the dot to the previous position of the feature you first marked to create the storm motion vector. Click the Track button in the WarnGen GUI to update the polygon based on the storm motion. Review the product loop and make adjustments to ensure the vector is accurate. The initial polygon may have unhatched areas that will be removed from the warning due to crossing CWAs or not meeting area thresholds in the county for inclusion. The Warned/Hatched Area button allows you to preview the polygon shape that will be issued, so you can make further edits. Moving Vertex Points \uf0c1 Vertices can be moved by clicking and dragging with the mouse. The warning polygon, including stippling, will update automatically.
When reshaping your warning polygon in this manner, the philosophy is to include all areas that are at risk of experiencing severe weather covered by that warning type. Effective polygons account for uncertainty over time and typically widen downstream. Add and Remove Vertex Points \uf0c1 There will be some occasions where you will want to add vertices to your warning polygon. Most often, these situations will involve line warnings with bowing segments or single storm warnings where you want to account for storm motion uncertainty or multiple threat areas that may have differing storm motions. New vertices are added to the warning polygon in one of two ways: either by a Right Mouse Button \"click and hold\" or a simple Middle Mouse Button click on the warning polygon line segment where you want to add the vertex. Vertex points are removed from the warning polygon using the same context relative menu. Instead of selecting a line segment, you select the vertex you wish to remove and then right mouse button click and hold and select remove vertex . Redrawing a Polygon \uf0c1 Click the Reset button to clear the current polygon and vector and reset the storm centroid marker. Generate a new storm motion by moving the storm markers and select the Track button in the WarnGen GUI to draw the new polygon. Text Window \uf0c1 Once you are satisfied with your polygon and have chosen your selections, click Create Text in the WarnGen GUI. Initially the AWIPS Header Block window appears. Leave the top two rows blank and click Enter for the text window to open. Using the customized settings in the WarnGen GUI, WarnGen translates the information into a text product that is displayed in a text window on the Text Display. The auto-generated text contains the storm speed and direction, the counties and cities affected by the warning/advisory, the valid times of the product, the warning/advisory body text (including any optional bullets selected in the GUI), and additional code to help our partners to efficiently process and disseminate the warning/advisory. The locked parts of the text are highlighted in blue and most of your text should not need to be edited if you configured your WarnGen window correctly. The Unidata AWIPS release is non-operational . You will be allowed to simulate the drawing and text-generation of warnings, but are prevented from transmitting a generated warning upstream . Note: Edits made to product text in the editor window should be limited to items such as forecaster name/initials, call-to-action text, etc. If changes are warranted for items such as storm motion, warned counties, or Latitude/Longitude points, close the editor window and make changes using the D-2D and WarnGen graphical tools, then recreate the polygon and/or the text.","title":"WarnGen Walkthrough"},{"location":"cave/warngen/#warngen-walkthrough","text":"WarnGen is an AWIPS graphics application for creating and issuing warnings as is done by National Weather Service offices. In the Unidata AWIPS release it is a non-operational forecasting tool, meaning it allows users to experiment and simulate with the drawing and text-generation tools, but prevents you from transmitting a generated warning upstream .
In order to select a feature it must be within your CAVE localization coverage (load Maps > County Warning Areas to see coverages)","title":"WarnGen Walkthrough"},{"location":"cave/warngen/#quick-steps-using-warngen-in-unidata-awips-cave","text":"Load NEXRAD Display from the Radar menu Choose a CWA with active severe weather (PAH is used in the video below) Re-localize to this site in the CAVE > Preferences > Localization menu Exit out of CAVE and reload (you should notice the new CWA at the top of CAVE) Load radar data from the local radar menu kpah > Z + SRM8 Use the \"period\" key in the number pad to toggle between the 0.5 Reflectivity and SRM Click WarnGen toolbar button or load from Tools > WarnGen Drag the storm marker to the center of a storm feature Step through frames back and forth and adjust the marker to match the trajectory of the storm feature Click Track in the Warngen GUI to update the polygon shape and trajectory From the WarnGen dialog select the type of warning to generate, time range, basis of the warning, and any threats (wind, hail, etc) Click Create Text at the bottom of the WarnGen dialog to generate a text warning product in a new window Note: Since you are not \"issuing\" the warning, leave the top to rows blank (\"TTAAii\" and \"CCCC\") and Click \"Enter\" and a separate text window should open Click Reset at the top of the WarnGen dialog to reset the storm marker at any time Select Line of Storms to enable a two-pointed vector which is to be positioned parallel to a storm line To add another vertex , middle button click along the polygon","title":"Quick Steps - Using WarnGen in Unidata AWIPS CAVE"},{"location":"cave/warngen/#video-using-warngen-in-awips","text":"The video below walks through creating a warning polygon and text in AWIPS. More detailed information can be found in the text below the video.","title":"Video - Using WarnGen in AWIPS"},{"location":"cave/warngen/#load-nexrad-level-3-display","text":"Select the menu Radar > NEXRAD Display and note coverage areas of current severe weather. We choose a CWA ID that contains some active severe weather (PAH Paducah, Kentucky, in this example).","title":"Load NEXRAD level 3 display"},{"location":"cave/warngen/#select-site-localization","text":"Open CAVE > Preferences > Localization , select the CWA site ID (PAH) for the coverage area you want to use, followed by Apply and Okay and restart CAVE. Once CAVE is restarted, you should notice the new CWA at the top of the CAVE window.","title":"Select SITE Localization"},{"location":"cave/warngen/#load-single-radar-data-from-the-local-radars","text":"Click on the local radar kpah > Z + SRM8 . Use the \"period\" key in the number pad to toggle between the 0.5 Reflectivity and SRM.","title":"Load single radar data from the local radars"},{"location":"cave/warngen/#launch-warngen","text":"Select WarnGen from the D2D Toolbar or from the Tools > WarnGen menu. When started, the storm centroid marker appears and the WarnGen GUI will pop up as a separate window.","title":"Launch WarnGen"},{"location":"cave/warngen/#generate-a-storm-motion-vector","text":"Click and drag Drag Me to Storm to the feature you want to track (WarnGen uses a dot to track a single storm and a line to track a line of storms). Step back 3 to 4 frames. Drag the dot to the previous position of the feature you first marked to create the storm motion vector. Click the Track button in the WarnGen GUI to update the polygon based off the storm motion. 
Review the product loop and make adjustments to ensure the vector is accurate. The initial polygon may have unhatched areas that will be removed from the warning due to crossing CWAs or not meeting area thresholds in the county for inclusion. The Warned/Hatched Area button allows you to preview the polygon shape that will be issued, so you can make further edits.","title":"Generate a Storm Motion Vector"},{"location":"cave/warngen/#moving-vertex-points","text":"Vertices can be moved by clicking and dragging with the mouse. The warning polygon, including stippling, will update automatically. When reshaping your warning polygon in this manner, the philosophy is to include all areas that are at risk of experiencing severe weather covered by that warning type. Effective polygons account for uncertainty over time and typically widen downstream.","title":"Moving Vertex Points"},{"location":"cave/warngen/#add-and-remove-vertex-points","text":"There will be some occasions where you will want to add vertices to your warning polygon. Most often, these situations will involve line warnings with bowing segments or single storm warnings where you want to account for storm motion uncertainty or multiple threat areas that may have differing storm motions. New vertices are added to the warning polygon two ways. Either by Right Mouse Button \"click and hold\" or a simple Middle Mouse Button click on the warning polygon line segment where you want to add the vertex. Vertex points are removed from the warning polygon using the same context relative menu. Instead of selecting a line segment, you select the vertex you wish to remove and then right mouse button click and hold and select remove vertex .","title":"Add and Remove Vertex Points"},{"location":"cave/warngen/#redrawing-a-polygon","text":"Click the Reset button to clear the current polygon and vector and reset the storm centroid marker. Generate a new storm motion by moving the storm markers and select the Track button in the WarnGen GUI to draw the new polygon.","title":"Redrawing a Polygon"},{"location":"cave/warngen/#text-window","text":"Once you are satisfied with your polygon and have chosen your selections, click Create Text in the WarnGen GUI. Initially the AWIPS Header Block window appears. Leave the top two rows bank and click Enter for the text window to open. Using the customized settings in the WarnGen GUI, WarnGen translates the information into a text product that is displayed in a text window on the Text Display. The auto-generated text contains the storm speed and direction, the counties and cities affected by the warning/advisory, the valid times of the product, the warning/advisory body text (including any optional bullets selected in the GUI), and additional code to help our partners to efficiently process and disseminate the warning/advisory. The locked parts of the text are highlighted in blue and most of your text should not need to be edited if you configured your WarnGen window correctly. The Unidata AWIPS release is non-operational . You will be allowed to simulate the drawing and text-generation of warnings, but are prevented from transmitting a generated warning upstream Note: Edits made to product text in the editor window should be limited to items such as forecaster name/initials, call-to-action text, etc. 
If changes are warranted for items such as storm motion, warned counties, or Latitude/Longitude points, close the editor window and make changes using the D-2D and WarnGen graphical tools, then recreate the polygon and/or the text.","title":"Text Window"},{"location":"dev/awips-development-environment/","text":"AWIPS Development Environment (ADE) \uf0c1 Detailed instructions on how to download the latest source code and run CAVE from Eclipse. It is important to keep in mind these instructions are intended for a system that is specifically used for developing AWIPS. It should not be used in conjunction with installed production versions of AWIPS. The following yum commands listed in these instructions may need to be run as the root user, but the rest of the commands should be run as the local user. 1. Remove AWIPS Instances \uf0c1 First, make sure to remove any instances of AWIPS that are already installed, this can potentially cause problems when setting up the development environment. Below is an example that had CAVE installed. Uninstall with yum: yum clean all yum groupremove awips2-cave Check to make sure all rpms have been removed: rpm -qa | grep awips2 Remove the awips2 directory: rm -rf /awips2 2. Set Up AWIPS Repo \uf0c1 Create a repo file named /etc/yum.repos.d/awips2.repo , and set the contents to the following: sudo vi /etc/yum.repos.d/awips2.repo [awips2repo] name=AWIPS II Repository baseurl=https://downloads.unidata.ucar.edu/awips2/current/linux/rpms/ el7-dev/ enabled=1 protect=0 gpgcheck=0 proxy=_none_ This file may already exist if AWIPS had been previously installed on the machine, so make sure to edit the baseurl. 3. Install the ADE \uf0c1 Install the AWIPS Development Environment (ADE) using yum. This will install Eclipse (4.6.1), Java (1.8), Ant (1.9.6), Python 2.7 and its modules (Numpy, Matplotlib, Shapely, Jep, and others). yum clean all yum groupinstall awips2-ade Check the libGLU package is installed by running rpm -qa | grep mesa-libGLU . If nothing is returned, install the package via: yum install mesa-libGLU . 4. Download the Source Code \uf0c1 If it's not already installed, install git: yum install git Next clone all of the required repositories for AWIPS: git clone https://github.com/Unidata/awips2.git git clone https://github.com/Unidata/awips2-cimss.git git clone https://github.com/Unidata/awips2-core.git git clone https://github.com/Unidata/awips2-core-foss.git git clone https://github.com/Unidata/awips2-drawing.git git clone https://github.com/Unidata/awips2-foss.git git clone https://github.com/Unidata/awips2-goesr.git git clone https://github.com/Unidata/awips2-gsd.git git clone https://github.com/Unidata/awips2-ncep.git git clone https://github.com/Unidata/awips2-nws.git Make sure to run git checkout in each repo if you'd wish to develop from a branch different from the default. It's best to do this before importing the repos into eclipse. 5. Configure Eclipse \uf0c1 Open eclipse by running: /awips2/eclipse/eclipse It is fine to choose the default workspace upon starting up. Set Preferences \uf0c1 Verify or make the following changes to set up eclipse for AWIPS development: Window > Preferences > Java > Installed JREs Set to /awips2/java Window > Preferences > PyDev > Interpreters > Python Interpreter Set to /awips2/python/bin/python Note: Add all paths to the SYSTEM pythonpath if prompted There might be some unresolved errors. These should be made to warnings instead. 
Window > Preferences > Java > Compiler > Building > Build path Problems > Circular Dependencies > Change to Warning Window > Preferences > Plug-in Development > API Baselines > Missing API Baseline > Change to Warning Turn off automatic building (you will turn this back on after importing the repos) Project > Uncheck \"Build Automatically\" Importing Git Repos \uf0c1 All of the git repos that were cloned in the previous step will need to be imported into Eclipse. But, be aware the awips2 repo is done last, because it requires different steps. File > Import > Git > Projects from Git > Next Continue with the default selection, Existing local repository > Add.. > add each of the git repos (for example .../awips2-core ) > check the checkbox > Finish Then for each of the repos (except awips2 right now): Select the repo name > Next > Continue with default selection (Working Tree) > Next > Continue with default selections (all choices selected) > Finish Finally, for awips2 repo, follow all the above steps except in the Working Tree, only select: cave > Next > Finish edexOsgi > Next > Finish Final Setup \uf0c1 Project > Clean > OK Use default selections: Clean all projects , Start a build immediately , Build the entire workspace Clean the build and ensure no errors are reported. Turn automatic building back on Project > Check \"Build Automatically\" 6. Run CAVE \uf0c1 CAVE can be ran from eclipse by using the com.raytheon.viz.product.awips/developer.product Double-click the developer.product file to open the Project Explorer in Eclipse. Select Overview > Synchronize Use the Project Explorer on the left-hand side of eclipse to run CAVE as a Java application or in Debug mode : Run Application \uf0c1 Select Run As > Eclipse Application Debug Application \uf0c1 Select Debug > Eclipse Application Troubleshooting \uf0c1 If you are getting a lot of errors, try changing your Java Compiler to 1.7, build the project, then change back to 1.8 and rebuild. Window > Preferences > Java > Compiler > Compiler compliance level setting","title":"Development"},{"location":"dev/awips-development-environment/#awips-development-environment-ade","text":"Detailed instructions on how to download the latest source code and run CAVE from Eclipse. It is important to keep in mind these instructions are intended for a system that is specifically used for developing AWIPS. It should not be used in conjunction with installed production versions of AWIPS. The following yum commands listed in these instructions may need to be run as the root user, but the rest of the commands should be run as the local user.","title":"AWIPS Development Environment (ADE)"},{"location":"dev/awips-development-environment/#1-remove-awips-instances","text":"First, make sure to remove any instances of AWIPS that are already installed, this can potentially cause problems when setting up the development environment. Below is an example that had CAVE installed. Uninstall with yum: yum clean all yum groupremove awips2-cave Check to make sure all rpms have been removed: rpm -qa | grep awips2 Remove the awips2 directory: rm -rf /awips2","title":"1. 
Remove AWIPS Instances"},{"location":"dev/awips-development-environment/#2-set-up-awips-repo","text":"Create a repo file named /etc/yum.repos.d/awips2.repo , and set the contents to the following: sudo vi /etc/yum.repos.d/awips2.repo [awips2repo] name=AWIPS II Repository baseurl=https://downloads.unidata.ucar.edu/awips2/current/linux/rpms/ el7-dev/ enabled=1 protect=0 gpgcheck=0 proxy=_none_ This file may already exist if AWIPS had been previously installed on the machine, so make sure to edit the baseurl.","title":"2. Set Up AWIPS Repo"},{"location":"dev/awips-development-environment/#3-install-the-ade","text":"Install the AWIPS Development Environment (ADE) using yum. This will install Eclipse (4.6.1), Java (1.8), Ant (1.9.6), Python 2.7 and its modules (Numpy, Matplotlib, Shapely, Jep, and others). yum clean all yum groupinstall awips2-ade Check the libGLU package is installed by running rpm -qa | grep mesa-libGLU . If nothing is returned, install the package via: yum install mesa-libGLU .","title":"3. Install the ADE"},{"location":"dev/awips-development-environment/#4-download-the-source-code","text":"If it's not already installed, install git: yum install git Next clone all of the required repositories for AWIPS: git clone https://github.com/Unidata/awips2.git git clone https://github.com/Unidata/awips2-cimss.git git clone https://github.com/Unidata/awips2-core.git git clone https://github.com/Unidata/awips2-core-foss.git git clone https://github.com/Unidata/awips2-drawing.git git clone https://github.com/Unidata/awips2-foss.git git clone https://github.com/Unidata/awips2-goesr.git git clone https://github.com/Unidata/awips2-gsd.git git clone https://github.com/Unidata/awips2-ncep.git git clone https://github.com/Unidata/awips2-nws.git Make sure to run git checkout in each repo if you'd wish to develop from a branch different from the default. It's best to do this before importing the repos into eclipse.","title":"4. Download the Source Code"},{"location":"dev/awips-development-environment/#5-configure-eclipse","text":"Open eclipse by running: /awips2/eclipse/eclipse It is fine to choose the default workspace upon starting up.","title":"5. Configure Eclipse"},{"location":"dev/awips-development-environment/#set-preferences","text":"Verify or make the following changes to set up eclipse for AWIPS development: Window > Preferences > Java > Installed JREs Set to /awips2/java Window > Preferences > PyDev > Interpreters > Python Interpreter Set to /awips2/python/bin/python Note: Add all paths to the SYSTEM pythonpath if prompted There might be some unresolved errors. These should be made to warnings instead. Window > Preferences > Java > Compiler > Building > Build path Problems > Circular Dependencies > Change to Warning Window > Preferences > Plug-in Development > API Baselines > Missing API Baseline > Change to Warning Turn off automatic building (you will turn this back on after importing the repos) Project > Uncheck \"Build Automatically\"","title":"Set Preferences"},{"location":"dev/awips-development-environment/#importing-git-repos","text":"All of the git repos that were cloned in the previous step will need to be imported into Eclipse. But, be aware the awips2 repo is done last, because it requires different steps. File > Import > Git > Projects from Git > Next Continue with the default selection, Existing local repository > Add.. 
> add each of the git repos (for example .../awips2-core ) > check the checkbox > Finish Then for each of the repos (except awips2 right now): Select the repo name > Next > Continue with default selection (Working Tree) > Next > Continue with default selections (all choices selected) > Finish Finally, for awips2 repo, follow all the above steps except in the Working Tree, only select: cave > Next > Finish edexOsgi > Next > Finish","title":"Importing Git Repos"},{"location":"dev/awips-development-environment/#final-setup","text":"Project > Clean > OK Use default selections: Clean all projects , Start a build immediately , Build the entire workspace Clean the build and ensure no errors are reported. Turn automatic building back on Project > Check \"Build Automatically\"","title":"Final Setup"},{"location":"dev/awips-development-environment/#6-run-cave","text":"CAVE can be ran from eclipse by using the com.raytheon.viz.product.awips/developer.product Double-click the developer.product file to open the Project Explorer in Eclipse. Select Overview > Synchronize Use the Project Explorer on the left-hand side of eclipse to run CAVE as a Java application or in Debug mode :","title":"6. Run CAVE"},{"location":"dev/awips-development-environment/#run-application","text":"Select Run As > Eclipse Application","title":"Run Application"},{"location":"dev/awips-development-environment/#debug-application","text":"Select Debug > Eclipse Application","title":"Debug Application"},{"location":"dev/awips-development-environment/#troubleshooting","text":"If you are getting a lot of errors, try changing your Java Compiler to 1.7, build the project, then change back to 1.8 and rebuild. Window > Preferences > Java > Compiler > Compiler compliance level setting","title":"Troubleshooting"},{"location":"dev/build-datadelivery/","text":"Data Delivery has been implemented into the AWIPS(II) baseline to provide access to data that is not resident locally at a Weather Forecast Office, River Forecast Center, or National Center. Data Delivery gives users the ability to create queries (One Time Requests) and subscriptions to data sets (provided OGC / OpenDAP servers such as THREDDS). build.edex/build.xml \uf0c1 ... Notice the last two commented out, com.raytheon.uf.edex.datadelivery.feature and com.raytheon.uf.edex.ogc.feature . These feature sets do not exist , but could easily be created in the same wat as other features (like com.raytheon.uf.common.base.feature , com.raytheon.uf.edex.base.feature , etc. wa-build \uf0c1 The source code comments provide the following guidance: In the work assignment's edexOsgi/build.edex directory, create a file named similar to the following: edexOsgi/build.edex/5-Data_Delivery-wa-build.properties In the file, there should be one line such as: wa.features=feature1,feature2 However, the wa-build Ant target requires a file features.txt exist. So if is 5-Data_Delivery-wa-build.properties or features.txt ? Because the delimiter being specified is a line separator (and not a comma \"wa.features=feature1,feature2\" as with versions proir to 16.2.2). So we can infer that a file should exist called features.txt should exist which has one WA feature per line. And what do you know, a similar file exist for the CAVE build in awips2-builds/cave/build/features.txt : cat awips2-builds/cave/build/features.txt com.raytheon.uf.common.base.feature com.raytheon.uf.viz.dataplugin.obs.feature ... ","title":"Build datadelivery"},{"location":"dev/build-datadelivery/#buildedexbuildxml","text":" ... 
Notice the last two commented out, com.raytheon.uf.edex.datadelivery.feature and com.raytheon.uf.edex.ogc.feature . These feature sets do not exist , but could easily be created in the same way as other features (like com.raytheon.uf.common.base.feature , com.raytheon.uf.edex.base.feature , etc.).","title":"build.edex/build.xml"},{"location":"dev/build-datadelivery/#wa-build","text":"The source code comments provide the following guidance: In the work assignment's edexOsgi/build.edex directory, create a file named similar to the following: edexOsgi/build.edex/5-Data_Delivery-wa-build.properties In the file, there should be one line such as: wa.features=feature1,feature2 However, the wa-build Ant target requires a features.txt file to exist. So is it 5-Data_Delivery-wa-build.properties or features.txt ? The delimiter being specified is a line separator (and not a comma, as in \"wa.features=feature1,feature2\" with versions prior to 16.2.2). So we can infer that a file called features.txt should exist, with one WA feature per line. As it happens, a similar file exists for the CAVE build in awips2-builds/cave/build/features.txt : cat awips2-builds/cave/build/features.txt com.raytheon.uf.common.base.feature com.raytheon.uf.viz.dataplugin.obs.feature ... ","title":"wa-build"},{"location":"devguide/data-flow/","text":"Data Receipt \uf0c1 The LDM obtains a data product from an upstream LDM site on the IDD. The LDM writes the data to file in Raw Data Storage. The LDM uses edexBridge to post a \u201cdata available\u201d message to the Qpid message broker. The EDEX Ingest process obtains the \u201cdata available\u201d message from Qpid and removes the message from the message queue. The EDEX Ingest process obtains the data files from Raw Data Storage. This architecture provides separation between data sources and ingest processing. Any data source, not just the LDM/IDD, can follow this architecture to provide data for EDEX to process. Data Decoding \uf0c1 Data decoding is defined as the process of converting data from a raw format into a decoded format that is usable by CAVE. In AWIPS, data decoding is performed by the EDEX Ingest processes ( ingest and ingestGrib ). EDEX Ingest obtains the \u201cdata available\u201d message from the Qpid message broker, and determines the appropriate data decoder based on the message contents. EDEX Ingest then forwards the message to the chosen decoder. Finally, the message is removed from the message queue. EDEX Ingest reads the data from Raw Data Storage. EDEX Ingest decodes the data. EDEX Ingest forwards the decoded data to Processed Data Storage. EDEX Ingest sends a message to the CAVE client indicating that newly-decoded data is available. It is important to note that in AWIPS all data types are processed by either the standard ingest process, or by the ingestGrib process, which handles all grib message ingestion. Once this data decoding process is complete, clients may obtain and perform additional processing on the data, including displaying data in CAVE. Processed Data Storage Architecture \uf0c1 Processed Data Storage refers to the persistence of decoded data and occurs in two separate forms: 1) metadata and some decoded data, which is stored in Postgres database tables; and 2) imagery data, gridded forecast data, and certain point data, which is stored in HDF5 files, and is managed by PyPIES. 
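Before walking through the write and read sequences below, it can help to look at each half of Processed Data Storage directly on a running EDEX machine. This is a minimal sketch rather than part of any formal procedure: it assumes the standard install paths used in this guide, that the commands are run on the EDEX server as the awips user, and that some satellite data has already been ingested (the satellite table is just one example of a metadata table). 
# list the satellite-related metadata tables in the Postgres metadata database 
psql metadata -c '\dt sat*' 
# count the satellite metadata records currently stored 
psql metadata -c 'select count(*) from satellite;' 
# list the per-plugin HDF5 directories managed by PyPIES 
ls /awips2/edex/data/hdf5 
If the table has rows and a matching plugin directory exists under the HDF5 tree, both halves of processed data storage are in use, following the write sequence described next. 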
Writing to Processed Data Storage involves the following: 1) The EDEX Process sends serialized data, area data, and certain point data to PyPIES. 2) PyPIES writes the data to the HDF5 data store. 3) EDEX sends the metadata to the Postgres DBMS. 4) Postgres writes the metadata to the AWIPS database. For data not stored in HDF5, Steps 1 and 2 are skipped. For processed data retrieval, the process is reversed: 3) EDEX requests the metadata from Postgres. 4) Postgres reads the AWIPS database and returns the requested metadata to EDEX. 1) EDEX sends a data request to PyPIES. 2) PyPIES reads the data from the HDF5 data store and returns it to EDEX. In this case, if the data is not stored in HDF5, then Steps 3 and 4 are skipped. Data Retrieval Architecture \uf0c1 Data retrieval is the process by which the CAVE client obtains data using the EDEX Request server; the Request server obtains the data from processed data storage (Postgres and HDF5) and returns it to CAVE. CAVE sends a request via TCP to the EDEX Request server. EDEX Request server obtains the requested metadata via Postgres and stored data via PyPIES. EDEX Request forwards the requested data directly back to the CAVE client. For clustered EDEX servers using IPVS, this architecture allows CAVE clients to access any available EDEX Request process, providing an improvement in system reliability and speed. Data retrieval from processed data storage follows the same pattern as data storage: retrieval of HDF5 is handled by PyPIES; retrieval of database data is handled by Postgres. Data Purge Architecture \uf0c1 Raw data storage and processed data storage use two different purge mechanisms. For processed data storage, AWIPS implements a plugin-based purge strategy, where the user has the option to change the purge frequency for each plugin individually. Raw Data Purge \uf0c1 Purging of Raw Data Storage is managed by the LDM user account cron, which executes the ldmadmin scour process, removing data files using an age-based strategy. The directories and retention times for raw data storage are controlled by scour.conf , which is located in the ldm user's ~/etc/ directory. Each entry in scour.conf contains the directory to manage, the retention time, and an optional file name pattern for data files. An ldm user cron job executes ldmadmin. ldmadmin executes the LDM scour program. The LDM scour program deletes outdated data from AWIPS Raw Data Storage. Processed Data Purge \uf0c1 Rules for this version-based purge are contained in XML files located in /awips2/edex/data/utility/ . The purge is triggered by a quartz timer event that fires at 30 minutes after each hour. A Quartz event is triggered in the EDEX Ingest process causing the Purge Service to obtain a purge lock. If the lock is already taken, the Purge Service will exit, ensuring that only a single EDEX Ingest process performs the purge. The EDEX Purge Service sends a delete message to Postgres. Postgres deletes the specified data from the database. If HDF5 data is to be purged, the Purge Service messages PyPIES. PyPIES deletes the specified HDF5 files.","title":"Data flow"},{"location":"devguide/data-flow/#data-receipt","text":"The LDM obtains a data product from an upstream LDM site on the IDD. The LDM writes the data to file in Raw Data Storage. The LDM uses edexBridge to post a \u201cdata available\u201d message to the Qpid message broker. The EDEX Ingest process obtains the \u201cdata available\u201d message from Qpid and removes the message from the message queue. 
The EDEX Ingest process obtains the data files from Raw Data Storage. This architecture provides separation between data sources and ingest processing. Any data source, not just the LDM/IDD, can follow this architecture to provide data for EDEX to process.","title":"Data Receipt"},{"location":"devguide/data-flow/#data-decoding","text":"Data decoding is defined as the process of converting data from a raw format into a decoded format that is usable by CAVE. In AWIPS, data decoding is performed by the EDEX Ingest processes ( ingest and ingestGrib ). EDEX Ingest obtains the \u201cdata available\u201d message from the Qpid message broker, and determines the appropriate data decoder based on the message contents. EDEX Ingest then forwards the message to the chosen decoder. Finally, the message is removed from the message queue. EDEX Ingest reads the data from Raw Data Storage. EDEX Ingest decodes the data. EDEX Ingest forwards the decoded data to Processed Data Storage. EDEX Ingest sends a message to the CAVE client indicating that newly-decoded data is available. It is important to note that in AWIPS all data types are processed by either the standard ingest process, or by the ingestGrib process, which handles all grib message ingestion. Once this data decoding process is complete, clients may obtain and perform additional processing on the data, including displaying data in CAVE.","title":"Data Decoding"},{"location":"devguide/data-flow/#processed-data-storage-architecture","text":"Processed Data Storage refers to the persistence of decoded data and occurs in two separate forms: 1) metadata and some decoded data, which is stored in Postgres database tables; and 2) imagery data, gridded forecast data, and certain point data, which is stored in HDF5 files, and is managed by PyPIES. Writing to Processed Data Storage involves the following: 1) The EDEX Process sends serialized data, area data, and certain point data to PyPIES. 2) PyPIES writes the data to the HDF5 data store. 3) EDEX sends the metadata to the Postgres DBMS. 4) Postgres writes the metadata to the AWIPS database. For data not stored in HDF5, Steps 1 and 2 are skipped. For processed data retrieval, the process is reversed: 3) EDEX requests the metadata from Postgres. 4) Postgres reads the AWIPS database and returns the requested metadata to EDEX. 1) EDEX sends a data request to PyPIES. 2) PyPIES reads the data from the HDF5 data store and returns it to EDEX. In this case, if the data is not stored in HDF5, then Steps 3 and 4 are skipped.","title":"Processed Data Storage Architecture"},{"location":"devguide/data-flow/#data-retrieval-architecture","text":"Data retrieval is the process by which the CAVE client obtains data using the EDEX Request server; the Request server obtains the data from processed data storage (Postgres and HDF5) and returns it to CAVE. CAVE sends a request via TCP to the EDEX Request server. EDEX Request server obtains the requested metadata via Postgres and stored data via PyPIES. EDEX Request forwards the requested data directly back to the CAVE client. For clustered EDEX servers using IPVS, this architecture allows CAVE clients to access any available EDEX Request process, providing an improvement in system reliability and speed. 
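Because the Request server answers CAVE over HTTP, a quick reachability check can confirm that this retrieval path is available before digging into Postgres or PyPIES. This is a minimal sketch only: edex-server is a placeholder for your own EDEX host, and 9581 is assumed to be the default Request port that CAVE itself connects to. 
# check that the EDEX Request port is open 
nc -z edex-server 9581 && echo 'request port open' 
# hit the same services endpoint CAVE uses; any HTTP status code means the Request process is answering 
curl -s -o /dev/null -w '%{http_code}\n' http://edex-server:9581/services 
These commands only confirm connectivity; the actual data lookups still follow the Postgres and PyPIES pattern described in this section. 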
Data retrieval from processed data storage follows the same pattern as data storage: retrieval of HDF5 is handled by PyPIES; retrieval of database data is handled by Postgres.","title":"Data Retrieval Architecture"},{"location":"devguide/data-flow/#data-purge-architecture","text":"Raw data storage and processed data storage use two different purge mechanisms. For processed data storage, AWIPS implements a plugin based purge strategy, where the user has the option to change the purge frequency for each plugin individually.","title":"Data Purge Architecture"},{"location":"devguide/data-flow/#raw-data-purge","text":"Purging of Raw Data Storage is managed by the LDM user account cron, which executes the ldmadmin scour process, removing data files using an age-based strategy. The directories and retention times for raw data storage are controlled by scour.conf , which is located in the ldm user's ~/etc/ directory. Each entry in scour.conf contains the directory to manage, the retention time and an optional file name pattern for data files. An ldm user cron job executes ldmadmin. ldmadmin executes the LDM scour program. The LDM scour program deletes outdated data from AWIPS Raw Data Storage.","title":"Raw Data Purge"},{"location":"devguide/data-flow/#processed-data-purge","text":"Rules for this version-based purge are contained in XML files located in /awips2/edex/data/utility/ . The purge is triggered by a quartz timer event that fires at 30 minutes after each hour. A Quartz event is triggered in the EDEX Ingest process causing the Purge Service to obtain a purge lock. If the lock is already taken, the Purge Service will exit, ensuring that only a single EDEX Ingest process performs the purge. The EDEX Purge Service sends a delete message to Postgres. Postgres deletes the specified data from the database. If HDF5 data is to be purged, the Purge Service messages PyPIES. PyPIES deletes the specified HDF5 files.","title":"Processed Data Purge"},{"location":"devguide/file-system/","text":"The major file systems on the Linux-OS EDEX Data Server are as follows: Linux File Systems \uf0c1 root ( / ), /tmp , /usr , /var . Linux mandates that these file systems exist. /boot . This file system contains the Linux kernel and boot-up instructions. /home . This file system contains all the user working areas. /dev/shm . This file system is the Linux shared memory. /etc/init.d . Location of startup services ( edex_postgres , httpd-pypies , qpidd , edex_camel ). AWIPS File Systems \uf0c1 /awips2 . This file system is used to store baselined AWIPS software. /awips2/database/data . Database files. /awips2/edex/data/hdf5 . Contains the HDF5 component of the data store and shared static data and hydro apps. /awips2/GFESuite . Contains scripts and data relating to inter site coordination (ISC) and service backup. /awips2/edex/data/utility . Contains localization store and EDEX configuration files. /awips2/httpd_pypies/etc/https/conf . Location of PyPIES Apache server configuration file httpd.conf . /awips2/qpid/etc . Location of Qpid configuration file qpidd.conf . /awips2/qpid/sbin . Location of qpidd executable and queueCreator.sh , which is called by /etc/init.d/qpidd . /awips2/ldm . LDM account home directory. /awips2/ldm/etc . Location of ldmd.conf and pqact.conf . /awips2/ldm/logs . Location of LDM logs. Raw Data Store File System \uf0c1 Folders are usually laid out exactly like the sbn folders on the EDEX server with each plug-in having a folder on the data store. 
But some of them do not follow the same convention, for e.g., data sent to the 'metar' endpoint will be found in the /data_store/text folder. Additionally, if ingest of a new format is being worked on, you will find these new data types not yet found on the development or integration systems, located in /data_store/experimental .","title":"File system"},{"location":"devguide/file-system/#linux-file-systems","text":"root ( / ), /tmp , /usr , /var . Linux mandates that these file systems exist. /boot . This file system contains the Linux kernel and boot-up instructions. /home . This file system contains all the user working areas. /dev/shm . This file system is the Linux shared memory. /etc/init.d . Location of startup services ( edex_postgres , httpd-pypies , qpidd , edex_camel ).","title":"Linux File Systems"},{"location":"devguide/file-system/#awips-file-systems","text":"/awips2 . This file system is used to store baselined AWIPS software. /awips2/database/data . Database files. /awips2/edex/data/hdf5 . Contains the HDF5 component of the data store and shared static data and hydro apps. /awips2/GFESuite . Contains scripts and data relating to inter site coordination (ISC) and service backup. /awips2/edex/data/utility . Contains localization store and EDEX configuration files. /awips2/httpd_pypies/etc/https/conf . Location of PyPIES Apache server configuration file httpd.conf . /awips2/qpid/etc . Location of Qpid configuration file qpidd.conf . /awips2/qpid/sbin . Location of qpidd executable and queueCreator.sh , which is called by /etc/init.d/qpidd . /awips2/ldm . LDM account home directory. /awips2/ldm/etc . Location of ldmd.conf and pqact.conf . /awips2/ldm/logs . Location of LDM logs.","title":"AWIPS File Systems"},{"location":"devguide/file-system/#raw-data-store-file-system","text":"Folders are usually laid out exactly like the sbn folders on the EDEX server with each plug-in having a folder on the data store. But some of them do not follow the same convention, for e.g., data sent to the 'metar' endpoint will be found in the /data_store/text folder. Additionally, if ingest of a new format is being worked on, you will find these new data types not yet found on the development or integration systems, located in /data_store/experimental .","title":"Raw Data Store File System"},{"location":"devguide/linux-tools/","text":"Several standard Linux tools can be used to monitor the EDEX processes, and for the purposes of this document and the Unidata AWIPS Training Workshop, it is assumed that all are available and that the user has some knowledge of how they are used. Regardless, this document includes the full command syntax that can be copy and pasted from the document to the terminal. ps - Display information about specific processes ps aux | grep edex cat - Used to display a text file in a terminal cat /awips2/ldm/etc/pqact.conf tail - Used to provide a dynamic picture of process logs tail -f /awips2/ldm/logs/ldmd.conf grep - Used to filter content of process logs; used to filter output of other tools grep edexBridge /awips2/ldm/etc/ldmd.conf top - Provides a dynamic view of the memory and cpu usage of the EDEX processes psql - A terminal-based front-end to PostgreSQL. We will be executing SQL queries. You do not need to have previous experience with SQL to follow this guide, but navigating AWIPS metadata is made much easier with some experience. [awips@edex ~]$ psql metadata Type \"help\" for help. metadata=# help You are using psql, the command-line interface to PostgreSQL. 
Type: \\copyright for distribution terms \\h for help with SQL commands \\? for help with psql commands \\g or terminate with semicolon to execute query \\q to quit metadata=# \\dt sat* List of relations Schema | Name | Type | Owner --------+-----------------------------------+-------+------- awips | satellite | table | awips awips | satellite_creating_entities | table | awips awips | satellite_geostationary_positions | table | awips awips | satellite_physical_elements | table | awips awips | satellite_sector_ids | table | awips awips | satellite_sources | table | awips awips | satellite_spatial | table | awips awips | satellite_units | table | awips (8 rows) metadata=# \\q","title":"Linux tools"},{"location":"devguide/regular-expressions/","text":"AWIPS uses regular expressions for data filtering at two steps in the ingest process: the LDM uses regular expressions to determine which data to write to /awips2/data_store /. An example for radars products defined in /awips2/ldm/etc/pqact.conf NEXRAD3 ^(SDUS[23578].) .... (......) /p(...)(...) FILE -overwrite -close -edex /awips2/data_store/radar/\\4/\\3/\\1_\\4_\\3_\\2_(seq).rad The FILE option determines the actions on the product, in this case the name of the file (using \\n numeration) as determined by the values captured inside of parentheses ( read more about LDM pattern actions... ) EDEX Ingest uses regular expressions to determine routing of raw data to decoder plug-ins based on WMO header and file name ( Read more about WMO headers... ). An example for products defined in /awips2/edex/data/utility/edex_static/base/distribution/radar.xml ^SDUS[234578]. . ^Level2_. ^Level3_.* Standard LDM regular expressions from /awips2/ldm/etc/pqact.conf Level 3 Radar (All) \uf0c1 NEXRAD3 ^(SDUS[23578].) .... (......) /p(...)(...) FILE -overwrite -close -edex /awips2/data_store/radar/\\4/\\3/\\1_\\4_\\3_\\2_(seq).rad Level 3 Radar (Subset) \uf0c1 NEXRAD3 ^(SDUS[23578].) .... (......) /p(DHR|DPR|DSP|DTA|DAA|DU3|DU6|DVL|EET|HHC|N3P|N0C|N0K|N0Q|N0S|N0U|N0X|N0Z|NCR|NMD|OHA)(...) FILE -overwrite -close -edex /awips2/data_store/radar/\\4/\\3/\\1_\\4_\\3_\\2_(seq).rad FNEXRAD Composites \uf0c1 FNEXRAD ^rad/NEXRCOMP/(...)/(...)_(........)_(....) FILE -close -edex /awips2/data_store/sat/nexrcomp_\\3\\4_\\2.gini.png Satellite Imagery \uf0c1 # NOAAPORT GINI images NIMAGE ^satz/ch[0-9]/.*/(.*)/([12][0-9])([0-9][0-9])([01][0-9])([0-3][0-9]) ([0-2][0-9])([0-5][0-9])/(.*)/(.*km)/ FILE -close -overwrite -edex /awips2/data_store/sat/\\8/\\9/\\1_\\2\\3\\4\\5_\\6\\7 # -------- GOES-East/West Northern Hemisphere Composites -------- # GOES-East/West VIS composites UNIWISC ^pnga2area Q. (CV) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_VIS_VIS_\\6_\\7 # GOES-East/West 3.9 um composites UNIWISC ^pnga2area Q. (CS) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_3.9_3.9_\\6_\\7 # GOES-East/West WV composites UNIWISC ^pnga2area Q. (CW) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_WV_WV_\\6_\\7 # GOES-East/West IR composites UNIWISC ^pnga2area Q. (CI) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_IR_IR_\\6_\\7 # GOES-East/West 13.3 um composites UNIWISC ^pnga2area Q. (CL) (.*) (.*) (.*) (.*) (........) (....) 
PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_13.3_13.3_\\6_\\7 # ------------------- SSEC Global Composites ------------------- # Global WV composite UNIWISC ^pnga2area Q. (GW) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GLOBAL_\\5_WV_WVCOMP_\\6_\\7 # Global IR composites UNIWISC ^pnga2area Q. (GI) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GLOBAL_\\5_IR_IRCOMP_\\6_\\7 # ----------------- Mollweide Global Composites ----------------- # Mollweide Global Water Vapor UNIWISC ^pnga2area Q. (UY) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_MOLLWEIDE_30km_WV_MOLLWV_\\6_\\7 # Mollweide Global IR UNIWISC ^pnga2area Q. (UX) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_MOLLWEIDE_30km_IR_MOLLIR_\\6_\\7 # These work # GOES Visible (UV 4km VIS disabled) UNIWISC ^pnga2area Q. (EV|U9) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_VIS_\\4_\\6_\\7 # GOES Water Vapor UNIWISC ^pnga2area Q. (UW|UB) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_WV_\\4_\\6_\\7 # GOES Thermal Infrared UNIWISC ^pnga2area Q. (UI|U5) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_IR_\\4_\\6_\\7 # GOES other UNIWISC ^pnga2area Q. (UD|UE|U7|U8|) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_\\4_\\6_\\7 # Arctic UNIWISC ^pnga2area Q. (U[LNGHO]) (.*) (.*) (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ARCTIC_4km_\\4_\\6_\\7 # Antarctic VIS Composite UNIWISC ^pnga2area Q. (UJ) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_VIS_\\3_\\4_\\6_\\7 # Antarctic PCOL Composite UNIWISC ^pnga2area Q. (UK) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_PCOL_\\3_\\4_\\6_\\7 # Antarctic WV Composite UNIWISC ^pnga2area Q. (UF) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_WV_\\3_\\4_\\6_\\7 # Antarctic Composite IR UNIWISC ^pnga2area Q. (U1) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_IR_\\3_\\4_\\6_\\7 # GOES Sounder Derived Image Products from University of Wisconsin CIMSS # CIMSS CAPE - McIDAS product code CE UNIWISC ^pnga2area Q0 CE .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_CAPE_\\4_\\5 # CIMSS Cloud Top Pressure - McIDAS product code CA UNIWISC ^pnga2area Q0 CA .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_CTP_\\4_\\5 # CIMSS Lifted Index - McIDAS product code CD UNIWISC ^pnga2area Q0 CD .... 
(.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_LI_\\4_\\5 # CIMSS Ozone - McIDAS product code CF UNIWISC ^pnga2area Q0 CF .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_OZONE_\\4_\\5 # CIMSS Total Column Precipitable Water - McIDAS product code CB UNIWISC ^pnga2area Q0 CB .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_PW_\\4_\\5 # CIMSS Sea Surface Temperature - McIDAS product code CC UNIWISC ^pnga2area Q0 CC .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_SST_\\4_\\5 # CIMSS Northern Hemisphere Wildfire ABBA - McIDAS product code CG (inactive) UNIWISC ^pnga2area Q0 CG (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_FIRESNH_\\4_\\5 # CIMSS Southern Hemisphere Wildfire ABBA - McIDAS product code CH (inactive) UNIWISC ^pnga2area Q0 CH (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_FIRESSH_\\4_\\5 Gridded Model Data \uf0c1 # GFS 0.5 deg (gfs.tCCz.pgrb2.0p50.fFFF) all hours out to F384 CONDUIT ^data/nccf/com/.*gfs.t[0-9][0-9]z.(pgrb2.0p50).*!(grib2)/[^/]*/(SSIGFS|GFS)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.*! (......) FILE -overwrite -log -close -edex /awips2/data_store/grib2/conduit/GFS/\\5_\\6Z_\\7_\\8-(seq).\\1.grib2 # NAM-40km (awip3d) - exclude awip12 = NAM12 since it is on NGRID (exclude NAM 90km) CONDUIT ^data/nccf/com/nam/.*nam.*(awip3d).*!(grib2)/ncep/(NAM_84)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-1]..)/([^/]*)/.*! (......) FILE -overwrite -log -close -edex /awips2/data_store/grib2/conduit/\\3/\\5_\\6Z_\\7_\\8-(seq).\\1.grib2 # NOAAport HRRR NGRID Y.C.[0-9][0-9] KWBY ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{12})F(...)/(.*)/.* FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/HRRR/\\1_F\\2_\\3_(seq).grib2 # GFS40 40km NGRID ^[LM].R... KWBC ...... !grib2/[^/]*/([^/]*)/#(212)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # RAP-13km NGRID ^[LM].D... KWBG ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # RTMA 197 (5km) NGRID ^[LM].M... KWBR ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # RTMA-Mosaic 2.5km (I) and URMA2.5 (Q) NGRID ^[LM].[IQ]... KWBR ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # NamDNG 2.5 and 5km NGRID ^[LM].[IM]... KWBE ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # NAM12 (#218) NGRID ^[LM].B... KWBE ...... 
!grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # GEM 000 CMC_reg_USWRF_NTAT_0_ps15km_2015042818_P003.grib2 CMC CMC_reg_(.*)km_(..........)_P(...).grib2 FILE -overwrite -log -close -edex /awips2/data_store/grib2/cmc/cmc_reg_\\1km_\\2_P\\3.grib2 # FNMOC FNMOC ^US058.*(0018_0056|0022_0179|0027_0186|0060_0188|0063_0187|0110_0240|0111_0179|0135_0240|0078_0200)_(.*)_(.*)_(.*)-.* FILE -log -overwrite -close -edex /awips2/data_store/grib2/fnmoc/US_058_\\1_\\2_\\3_\\4-(seq).grib","title":"Regular expressions"},{"location":"devguide/regular-expressions/#level-3-radar-all","text":"NEXRAD3 ^(SDUS[23578].) .... (......) /p(...)(...) FILE -overwrite -close -edex /awips2/data_store/radar/\\4/\\3/\\1_\\4_\\3_\\2_(seq).rad","title":"Level 3 Radar (All)"},{"location":"devguide/regular-expressions/#level-3-radar-subset","text":"NEXRAD3 ^(SDUS[23578].) .... (......) /p(DHR|DPR|DSP|DTA|DAA|DU3|DU6|DVL|EET|HHC|N3P|N0C|N0K|N0Q|N0S|N0U|N0X|N0Z|NCR|NMD|OHA)(...) FILE -overwrite -close -edex /awips2/data_store/radar/\\4/\\3/\\1_\\4_\\3_\\2_(seq).rad","title":"Level 3 Radar (Subset)"},{"location":"devguide/regular-expressions/#fnexrad-composites","text":"FNEXRAD ^rad/NEXRCOMP/(...)/(...)_(........)_(....) FILE -close -edex /awips2/data_store/sat/nexrcomp_\\3\\4_\\2.gini.png","title":"FNEXRAD Composites"},{"location":"devguide/regular-expressions/#satellite-imagery","text":"# NOAAPORT GINI images NIMAGE ^satz/ch[0-9]/.*/(.*)/([12][0-9])([0-9][0-9])([01][0-9])([0-3][0-9]) ([0-2][0-9])([0-5][0-9])/(.*)/(.*km)/ FILE -close -overwrite -edex /awips2/data_store/sat/\\8/\\9/\\1_\\2\\3\\4\\5_\\6\\7 # -------- GOES-East/West Northern Hemisphere Composites -------- # GOES-East/West VIS composites UNIWISC ^pnga2area Q. (CV) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_VIS_VIS_\\6_\\7 # GOES-East/West 3.9 um composites UNIWISC ^pnga2area Q. (CS) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_3.9_3.9_\\6_\\7 # GOES-East/West WV composites UNIWISC ^pnga2area Q. (CW) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_WV_WV_\\6_\\7 # GOES-East/West IR composites UNIWISC ^pnga2area Q. (CI) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_IR_IR_\\6_\\7 # GOES-East/West 13.3 um composites UNIWISC ^pnga2area Q. (CL) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GEWCOMP_\\5_13.3_13.3_\\6_\\7 # ------------------- SSEC Global Composites ------------------- # Global WV composite UNIWISC ^pnga2area Q. (GW) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GLOBAL_\\5_WV_WVCOMP_\\6_\\7 # Global IR composites UNIWISC ^pnga2area Q. (GI) (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_GLOBAL_\\5_IR_IRCOMP_\\6_\\7 # ----------------- Mollweide Global Composites ----------------- # Mollweide Global Water Vapor UNIWISC ^pnga2area Q. (UY) (.*) (.*)_IMG (.*)um (.*) (........) (....) 
PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_MOLLWEIDE_30km_WV_MOLLWV_\\6_\\7 # Mollweide Global IR UNIWISC ^pnga2area Q. (UX) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_MOLLWEIDE_30km_IR_MOLLIR_\\6_\\7 # These work # GOES Visible (UV 4km VIS disabled) UNIWISC ^pnga2area Q. (EV|U9) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_VIS_\\4_\\6_\\7 # GOES Water Vapor UNIWISC ^pnga2area Q. (UW|UB) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_WV_\\4_\\6_\\7 # GOES Thermal Infrared UNIWISC ^pnga2area Q. (UI|U5) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_IR_\\4_\\6_\\7 # GOES other UNIWISC ^pnga2area Q. (UD|UE|U7|U8|) (.*) (.*)_IMG (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_\\1_\\3_\\5_\\4_\\6_\\7 # Arctic UNIWISC ^pnga2area Q. (U[LNGHO]) (.*) (.*) (.*)um (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ARCTIC_4km_\\4_\\6_\\7 # Antarctic VIS Composite UNIWISC ^pnga2area Q. (UJ) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_VIS_\\3_\\4_\\6_\\7 # Antarctic PCOL Composite UNIWISC ^pnga2area Q. (UK) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_PCOL_\\3_\\4_\\6_\\7 # Antarctic WV Composite UNIWISC ^pnga2area Q. (UF) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_WV_\\3_\\4_\\6_\\7 # Antarctic Composite IR UNIWISC ^pnga2area Q. (U1) (.*) (.*)_IMG (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_ANTARCTIC_4km_IR_\\3_\\4_\\6_\\7 # GOES Sounder Derived Image Products from University of Wisconsin CIMSS # CIMSS CAPE - McIDAS product code CE UNIWISC ^pnga2area Q0 CE .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_CAPE_\\4_\\5 # CIMSS Cloud Top Pressure - McIDAS product code CA UNIWISC ^pnga2area Q0 CA .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_CTP_\\4_\\5 # CIMSS Lifted Index - McIDAS product code CD UNIWISC ^pnga2area Q0 CD .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_LI_\\4_\\5 # CIMSS Ozone - McIDAS product code CF UNIWISC ^pnga2area Q0 CF .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_OZONE_\\4_\\5 # CIMSS Total Column Precipitable Water - McIDAS product code CB UNIWISC ^pnga2area Q0 CB .... (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_PW_\\4_\\5 # CIMSS Sea Surface Temperature - McIDAS product code CC UNIWISC ^pnga2area Q0 CC .... (.*) (.*) (.*) (........) (....) 
PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_SST_\\4_\\5 # CIMSS Northern Hemisphere Wildfire ABBA - McIDAS product code CG (inactive) UNIWISC ^pnga2area Q0 CG (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_FIRESNH_\\4_\\5 # CIMSS Southern Hemisphere Wildfire ABBA - McIDAS product code CH (inactive) UNIWISC ^pnga2area Q0 CH (.*) (.*) (.*) (.*) (........) (....) PIPE -close -log pnga2area -vl logs/pnga2area.log /awips2/data_store/ingest/uniwisc_SOUNDER_\\3_FIRESSH_\\4_\\5","title":"Satellite Imagery"},{"location":"devguide/regular-expressions/#gridded-model-data","text":"# GFS 0.5 deg (gfs.tCCz.pgrb2.0p50.fFFF) all hours out to F384 CONDUIT ^data/nccf/com/.*gfs.t[0-9][0-9]z.(pgrb2.0p50).*!(grib2)/[^/]*/(SSIGFS|GFS)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]..)/([^/]*)/.*! (......) FILE -overwrite -log -close -edex /awips2/data_store/grib2/conduit/GFS/\\5_\\6Z_\\7_\\8-(seq).\\1.grib2 # NAM-40km (awip3d) - exclude awip12 = NAM12 since it is on NGRID (exclude NAM 90km) CONDUIT ^data/nccf/com/nam/.*nam.*(awip3d).*!(grib2)/ncep/(NAM_84)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-1]..)/([^/]*)/.*! (......) FILE -overwrite -log -close -edex /awips2/data_store/grib2/conduit/\\3/\\5_\\6Z_\\7_\\8-(seq).\\1.grib2 # NOAAport HRRR NGRID Y.C.[0-9][0-9] KWBY ...... !grib2/[^/]*/[^/]*/#[^/]*/([0-9]{12})F(...)/(.*)/.* FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/HRRR/\\1_F\\2_\\3_(seq).grib2 # GFS40 40km NGRID ^[LM].R... KWBC ...... !grib2/[^/]*/([^/]*)/#(212)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # RAP-13km NGRID ^[LM].D... KWBG ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # RTMA 197 (5km) NGRID ^[LM].M... KWBR ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # RTMA-Mosaic 2.5km (I) and URMA2.5 (Q) NGRID ^[LM].[IQ]... KWBR ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # NamDNG 2.5 and 5km NGRID ^[LM].[IM]... KWBE ...... !grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # NAM12 (#218) NGRID ^[LM].B... KWBE ...... 
!grib2/[^/]*/([^/]*)/#([^/]*)/([0-9]{8})([0-9]{4})(F[0-9]{3})/([^/]*) FILE -overwrite -log -close -edex /awips2/data_store/grib2/noaaport/GRID\\2/\\1_\\3_\\4Z_\\5_\\6-(seq).grib2 # GEM 000 CMC_reg_USWRF_NTAT_0_ps15km_2015042818_P003.grib2 CMC CMC_reg_(.*)km_(..........)_P(...).grib2 FILE -overwrite -log -close -edex /awips2/data_store/grib2/cmc/cmc_reg_\\1km_\\2_P\\3.grib2 # FNMOC FNMOC ^US058.*(0018_0056|0022_0179|0027_0186|0060_0188|0063_0187|0110_0240|0111_0179|0135_0240|0078_0200)_(.*)_(.*)_(.*)-.* FILE -log -overwrite -close -edex /awips2/data_store/grib2/fnmoc/US_058_\\1_\\2_\\3_\\4-(seq).grib","title":"Gridded Model Data"},{"location":"edex/archiver/","text":"Grant Users Permission to Create Case Study Archives \uf0c1 The file /awips2/edex/data/utility/common_static/base/roles/archiveAdminRoles.xml controls which users can run the archiving tools from CAVE. Data Archiving This permission allows the user to access Archive Retention. This permission allows the user to access Archive Case Creation. archive.retention archive.casecreation will allow any connected CAVE user to run both the Archive Retention and the Archive Case Creation tools. If you want to control access to individual users, such as the example bwlo, which will allow any user to create case studies, but only the awips user to run the Archive Retention tool. archive.retention archive.casecreation Define EDEX User Administration Roles \uf0c1 Admins can use the CAVE User Administration interface to manage user access roles. The file /awips2/edex/data/utility/common_static/base/roles/awipsUserAdminRoles.xml controls access to this tool. User Administration This permission allows the user to access and edit AWIPS 2 User Administration awips.user.admin EDEX Archiver \uf0c1 /awips2/edex/conf/resources/com.raytheon.uf.edex.archive.cron.properties # enable archive archive.enable=false # runs database and hdf5 archive for archive server to pull data from archive.cron=0+40+*+*+*+? # path to store processed archive data archive.path=/awips2/archive # enable archive purge archive.purge.enable=true # when to purge archives archive.purge.cron=0+5+0/2+*+*+? # compress database records archive.compression.enable=false # To change Default case directory. #archive.case.directory=/awips2/edex/data/archiver/ # to disable a specific archive, use property archive.disable=pluginName,pluginName... #archive.disable=grid,text,acars The EDEX Archiver plugin can be used to automate data backup or create case study archive files to be retained by EDEX. The file /awips2/edex/data/utility/common_static/base/archiver/purger/PROCESSED_DATA.xml controls which products are ardhived, and how. Archive Log \uf0c1 The file /awips2/edex/logs/edex-ingest-archive-*.log will report status of the archiver whenever it is run. With regular archiving disabled (by default) will see messages such as INFO 2016-11-30 09:40:00,010 [Archiver] DataArchiver: EDEX - Archival of plugin data disabled, exiting INFO 2016-11-30 10:40:00,009 [Archiver] DataArchiver: EDEX - Archival of plugin data disabled, exiting INFO 2016-11-30 11:40:00,010 [Archiver] DataArchiver: EDEX - Archival of plugin data disabled, exiting INFO 2016-11-30 12:40:00,010 [Archiver] DataArchiver: EDEX - Archival of plugin data disabled, exiting /awips2/edex/data/utility/common_static/base/archiver/purger/PROCESSED_DATA.xml \uf0c1 , , , and are the four tags which configure the EDEX Archiver. 
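The XML element names in this section do not survive in this rendering of the page, which is why the sentence above and the example that follows look incomplete. As a rough sketch only, an archive and category entry in PROCESSED_DATA.xml is laid out along the following lines; the element names are recalled from a typical baseline file rather than taken from this page, so treat them as assumptions and verify them against the copy shipped under /awips2/edex/data/utility/common_static/base/archiver/purger/ before editing. 
<archive> 
  <name>Processed</name>                        <!-- archive name shown in the CAVE archive dialogs --> 
  <rootDir>/awips2/archive/</rootDir>           <!-- where archived data is written --> 
  <minRetentionHours>168</minRetentionHours>    <!-- default retention for the archive --> 
  <category> 
    <name>Model</name>                          <!-- category id used in CAVE --> 
    <extRetentionHours>168</extRetentionHours>  <!-- retention for selected Data Sets in this category --> 
    <dataSet> 
      <dirPattern>(grid)/(.*)/(.*)/.*-(\d{4})-(\d{2})-(\d{2})-(\d{2})-.*</dirPattern> 
      <displayLabel>{2}</displayLabel>              <!-- label built from the pattern groups --> 
      <dateGroupIndices>4,5,6,7</dateGroupIndices>  <!-- groups holding year, month, day, hour --> 
    </dataSet> 
  </category> 
</archive> 
The values mirror the example that follows: a Processed archive rooted at /awips2/archive/ with a 168-hour retention, and a Model category whose directory pattern, display label {2}, and date group indices 4,5,6,7 are described tag by tag in the rest of this section. 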
Processed /awips2/archive/ 168 Model 168 (grid)/(.*)/(.*)/.*-(\\d{4})-(\\d{2})-(\\d{2})-(\\d{2})-.* {2} 4,5,6,7 (modelsounding)/(.*)/.*/.*(\\d{4})-(\\d{2})-(\\d{2})-(\\d{2}).* (bufrmos)(.*)/.*(\\d{4})-(\\d{2})-(\\d{2})-(\\d{2}) {1} - {2} 3,4,5,6 is used as a logical grouping of the archive sub-directories, and contains the following tags: - The id for the category, used in CAVE. - Optional. The hours to retain data in directories of selected Data Sets for a category. Default is 1 hour. - A directory matching . These are selected directories from the Retention GUI. The purger will used the category's instead of the Arhivie's . An Optional. may have more then one. (NOTE these are set internally when a selection configuration file is loaded. They should not appear in the configuration file.) - Required to have a least one. Each one contains a set of tags explained below. The tag contains: - A regex pattern for finding directories for this category. Required to have at least one. The pattern is relative to . Wildcard patterns do not cross directory delimiter / . Thus to match 3 levels of directories you would need .*/.*/.* (see patterns and groups section). There may be more then one in a , but they must all have the same number of groupings and be in the same order to match up with , and . - Optional. A pattern to find files in the directories that match . Default is everything in the directories that match . See patterns and groups section. - The label to display for directories that match . Any group in the may be made part of the label by placing the group's index inside parenthesis, {1} . More then one directory may match the . The archive GUIs may collapse them into a single table entry. - Optional tag to determine what type of time stamp is being used to get files/directories for retention and case creation. The value dictates how many groupings in the s and/or are used to get the time stamp for a file. The five values are: Date - (default) the time stamp is made up of 3 or 4 groups in the patterns: year , month , day and (optional) hour . Julian - Time stamp is made up of 2 or 3 groups in the patterns: year , day_of_year and (optional) hour . EpochSec - Time stamp epoch time in seconds. EpochMS - Time stamp epoch time in milliseconds. File - Instead use the files date of last modification. No group is used to get the time stamp. - Required tag when has any value but File . Date - A comma separated list of 3 or 4 numbers which are in order the index for year , month , day and hour . When only 3 numbers the hour is value is 23. Julian - A comma separated list of 2 or 3 numbers which are in order the index for year , day of year , and hour . When only two numbers the hour value is 23. EpochSec - A number which is the index for the epoch in seconds. EpochMS - A number which is the index for the epoch in milliseconds. File - Not needed since no group is used to get the time stamp. This is used to determine what files/directories to retain or a range of directories/files to copy for case creation. Note to get the group's index the and are combined. Thus if there are 5 groups in the then the first group in the is index 6. ## Patterns and groups. and use Java regex expressions , similar to the ldm's pqact.conf file. The groupings index start at one. The groups in the can be used in the . For example: (grib2)/(\\d{4})(\\d{2})(\\d{2})/(\\d{2})/(.*) {1} - {6} 2,3,4,5 contains six groups. 
The first group is the literal grib2 which matches only a directory named grib2 that is a sub-directory of the . The groups 2, 3 and 4 break apart the next level of sub-directories into a 4 digit and two 2 digit groups. This is the expected year , month , day sub-subdirectory indicated by the first 3 entries in . The next sub-directory contains the fifth group which is a two digit number representing the hour. Finally the sixth group will match any sub-directory that in the hour directory. Thus the directory paths /grib2/20130527/18/GFS will generate the display string, grib2 - GFS , and from the grouping we can find the year, 2013 ; month, 05 ; day, 27 and hour, 18 . Example with \uf0c1 hdf5/(redbook) {1} redbook-(\\d{4})-(\\d{2})-(\\d{2})-(\\d{2})\\..* 2,3,4,5 Example with multiple \uf0c1 Observation 168 (acars|airep|airmet|taf) (bufrsigwx|sfcobs)/.* {1} Date 2,3,4,5 .*-(\\d{4})-(\\d{2})-(\\d{2})-(\\d{2})\\..* The first looks for files matching the in the directories acars , airep , airmet or taf . The second expects to find the files in subdirectories of bufrsigwx or sfcobs such as bufrsigwx/SWH . Here the display will only show, redbook. The directory looked at will be /redbook/ . The all come from the since there is one group in the the groups in the start at two. This matches file names redbook-YYYY-MM-DD-HH. . Thus the file name redbook-2013-05-28-00.hd5 would match the . NOTE group {0} is a string that matches the whole . If this is used in the would see every directory that matches the pattern.","title":"Grant Users Permission to Create Case Study Archives"},{"location":"edex/archiver/#grant-users-permission-to-create-case-study-archives","text":"The file /awips2/edex/data/utility/common_static/base/roles/archiveAdminRoles.xml controls which users can run the archiving tools from CAVE. Data Archiving This permission allows the user to access Archive Retention. This permission allows the user to access Archive Case Creation. archive.retention archive.casecreation will allow any connected CAVE user to run both the Archive Retention and the Archive Case Creation tools. If you want to control access to individual users, such as the example bwlo, which will allow any user to create case studies, but only the awips user to run the Archive Retention tool. archive.retention archive.casecreation ","title":"Grant Users Permission to Create Case Study Archives"},{"location":"edex/archiver/#define-edex-user-administration-roles","text":"Admins can use the CAVE User Administration interface to manage user access roles. The file /awips2/edex/data/utility/common_static/base/roles/awipsUserAdminRoles.xml controls access to this tool. User Administration This permission allows the user to access and edit AWIPS 2 User Administration awips.user.admin ","title":"Define EDEX User Administration Roles"},{"location":"edex/archiver/#edex-archiver","text":"/awips2/edex/conf/resources/com.raytheon.uf.edex.archive.cron.properties # enable archive archive.enable=false # runs database and hdf5 archive for archive server to pull data from archive.cron=0+40+*+*+*+? # path to store processed archive data archive.path=/awips2/archive # enable archive purge archive.purge.enable=true # when to purge archives archive.purge.cron=0+5+0/2+*+*+? # compress database records archive.compression.enable=false # To change Default case directory. #archive.case.directory=/awips2/edex/data/archiver/ # to disable a specific archive, use property archive.disable=pluginName,pluginName... 
The EDEX Archiver plugin can be used to automate data backup or to create case study archive files to be retained by EDEX. The file /awips2/edex/data/utility/common_static/base/archiver/purger/PROCESSED_DATA.xml controls which products are archived, and how.","title":"EDEX Archiver"},{"location":"edex/archiver/#archive-log","text":"The file /awips2/edex/logs/edex-ingest-archive-*.log reports the status of the archiver whenever it is run. With regular archiving disabled (the default), you will see messages such as INFO 2016-11-30 09:40:00,010 [Archiver] DataArchiver: EDEX - Archival of plugin data disabled, exiting INFO 2016-11-30 10:40:00,009 [Archiver] DataArchiver: EDEX - Archival of plugin data disabled, exiting INFO 2016-11-30 11:40:00,010 [Archiver] DataArchiver: EDEX - Archival of plugin data disabled, exiting INFO 2016-11-30 12:40:00,010 [Archiver] DataArchiver: EDEX - Archival of plugin data disabled, exiting","title":"Archive Log"},{"location":"edex/archiver/#awips2edexdatautilitycommon_staticbasearchiverpurgerprocessed_dataxml","text":" <name> , <rootDir> , <minRetentionHours> , and <category> are the four tags which configure the EDEX Archiver. Processed /awips2/archive/ 168 Model 168 (grid)/(.*)/(.*)/.*-(\\d{4})-(\\d{2})-(\\d{2})-(\\d{2})-.* {2} 4,5,6,7 (modelsounding)/(.*)/.*/.*(\\d{4})-(\\d{2})-(\\d{2})-(\\d{2}).* (bufrmos)(.*)/.*(\\d{4})-(\\d{2})-(\\d{2})-(\\d{2}) {1} - {2} 3,4,5,6
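Pieced back together as a sketch (the tag names and nesting follow the descriptions in the paragraphs below; this is illustrative, not a verbatim copy of the shipped file), the configuration has the general shape:
<archive>
    <!-- illustrative skeleton, not the shipped PROCESSED_DATA.xml -->
    <name>Processed</name>
    <rootDir>/awips2/archive/</rootDir>
    <minRetentionHours>168</minRetentionHours>
    <category>
        <name>Model</name>
        <extRetentionHours>168</extRetentionHours>
        <dataSet>
            <!-- one or more dirPattern tags, an optional filePattern, a displayLabel,
                 an optional timeType, and dateGroupIndices go here (see below) -->
        </dataSet>
    </category>
</archive>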
The <category> tag is used as a logical grouping of the archive sub-directories, and contains the following tags: <name> - The id for the category, used in CAVE. <extRetentionHours> - Optional. The hours to retain data in directories of selected Data Sets for a category. Default is 1 hour. <selectedDisplayNames> - A directory matching <dirPattern> . These are selected directories from the Retention GUI. The purger will use the category's <extRetentionHours> instead of the archive's <minRetentionHours> . Optional; there may be more than one. (NOTE: these are set internally when a selection configuration file is loaded. They should not appear in the configuration file.) <dataSet> - Required to have at least one. Each one contains the set of tags explained below. The <dataSet> tag contains: <dirPattern> - A regex pattern for finding directories for this category. Required to have at least one. The pattern is relative to <rootDir> . Wildcard patterns do not cross the directory delimiter / . Thus, to match 3 levels of directories you would need .*/.*/.* (see the patterns and groups section). There may be more than one <dirPattern> in a <dataSet> , but they must all have the same number of groupings, in the same order, to match up with <filePattern> , <displayLabel> and <dateGroupIndices> . <filePattern> - Optional. A pattern to find files in the directories that match <dirPattern> . The default is everything in the directories that match <dirPattern> . See the patterns and groups section. <displayLabel> - The label to display for directories that match <dirPattern> . Any group in the <dirPattern> may be made part of the label by placing the group's index inside braces, e.g. {1} . More than one directory may match the <dirPattern> ; the archive GUIs may collapse them into a single table entry. <timeType> - Optional tag to determine what type of time stamp is used to get files/directories for retention and case creation. The value dictates how many groupings in the <dirPattern> s and/or <filePattern> are used to get the time stamp for a file. The five values are: Date - (default) the time stamp is made up of 3 or 4 groups in the patterns: year , month , day and (optional) hour . Julian - the time stamp is made up of 2 or 3 groups in the patterns: year , day_of_year and (optional) hour . EpochSec - the time stamp is epoch time in seconds. EpochMS - the time stamp is epoch time in milliseconds. File - use the file's date of last modification instead; no group is used to get the time stamp. <dateGroupIndices> - Required whenever <timeType> has any value but File . Date - A comma-separated list of 3 or 4 numbers which are, in order, the indexes for year , month , day and hour . When only 3 numbers are given, the hour value is 23. Julian - A comma-separated list of 2 or 3 numbers which are, in order, the indexes for year , day of year , and hour . When only two numbers are given, the hour value is 23. EpochSec - A number which is the index for the epoch in seconds. EpochMS - A number which is the index for the epoch in milliseconds. File - Not needed since no group is used to get the time stamp. This is used to determine which files/directories to retain, or a range of directories/files to copy for case creation. Note: to get a group's index, the <dirPattern> and