###############################################################################

Influx/Grafana plotting

Copy data from the POC eng database to an influx time-series database,
then plot/analyze the data using tools such as chronograf or grafana.

Examples:
---------------------------------------------------------------------

    eng2influx sync --source tess.mit.edu/engdb_test --destination https://tessts.mit.edu/engdb

    eng2influx mnemonics --source tess.mit.edu/engdb_test --sensor rx

    eng2influx utc2tjd
    UTC: 1510763574.53 (2017.11.15T16:32:54)
    TJD: 1073.1902976

    eng2influx utc2tjd --time 1073.1902976
    UTC: 1510763593.18 (2017.11.15T16:33:13)
    TJD: 1073.19051344

Profiling:
---------------------------------------------------------------------

Sync of engdb_test data as of 15nov2017 took about 40 minutes for all
sensors.

Usage:
---------------------------------------------------------------------

Usage: eng2influx action [options]

Actions:
    setup     - configure a timeseries database
    sync      - copy data from eng database to timeseries database
    mnemonics - display the sensor mnemonics
    tjd2utc   - convert TESS julian date to UTC
    utc2tjd   - convert UTC to TESS julian date

Options:
    -h, --help                show this help message and exit
    --version                 display the version
    --debug                   emit additional diagnostic information
    --log=LOG_FILE            log file
    --sensor=MNEMONIC_FILTER  which sensor(s) to copy
    --source=DSN              source database
    --destination=DSN         destination database
    --start=START_TIME        start time as UTC YYYY.mm.ddTHH:MM:SS or TJD xx.x
    --end=END_TIME            end time as UTC YYYY.mm.ddTHH:MM:SS or TJD xx.x
    --time=TIME               timestamp as UTC YYYY.mm.ddTHH:MM:SS or TJD xx.x
    --dry-run                 show what would be done but do not do it

Installation:
---------------------------------------------------------------------

    make install [PREFIX=/path/to/install]

###############################################################################

Basic HTML plotting

This is a utility to plot time series data from a CSV file and apply a
linear curve fit to the data.
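The linear curve fit mentioned above is an ordinary least-squares line. The
page itself does the fit in javascript, but the math is the same anywhere; a
minimal Python sketch (the function name `linear_fit` is ours, not part of
the tool):

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit of y = a + b*x.

    Returns (a, b): intercept and slope minimizing the sum of
    squared vertical residuals.
    """
    n = len(xs)
    sx = sum(xs)
    sy = sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    # Standard closed-form normal-equation solution for a line.
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

# Points on the line y = 2x + 1 recover intercept 1 and slope 2.
a, b = linear_fit([0, 1, 2, 3], [1, 3, 5, 7])
```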
Usage:
---------------------------------------------------------------------

1) Get a dump of data from the sensor database

    outcsv [ -a ] [ --start X --end Y ]

2) Split the dump into separate csv files and generate the sensor label
   file. This will emit one csv file for each sensor into the odir, and
   it will emit a sensor-list file that the web page uses to recognize
   the sensors.

    splitcsv.pl --ifile data.csv

3) Point a web browser at the file data.html. Be sure to read the
   'Caveats' section below.

    open data.html

Testing:
---------------------------------------------------------------------

Use the included test data to verify that everything works.

    splitcsv.pl --ifile test/fpe_data.csv

Optionally, extract a subset of data:

    splitcsv.pl --ifile data.csv --filter "sensorA,sensorB,sensorC"

Requirements:
---------------------------------------------------------------------

Look at particular HK/ancillary data; plot trends and fits; show a
pretty report with the TESS logo in the upper-left corner. There should
be units on the axes; autoscaling should be fine for now.

Proposed design:
---------------------------------------------------------------------

A single web page with javascript-based plots of sensor timeseries data.

Details:
---------------------------------------------------------------------

This implementation uses dygraphs, a javascript plotting package that
handles large data sets very nicely.

    data.html            - the page that displays sensor data
    data/                - destination for individual csv files
    splitcsv.pl          - utility to split a single csv into one file per sensor
    dygraph-x.x.x.css    - default css for dygraph
    dygraph-x.x.x.js     - the dygraph code
    dygraph-x.x.x.min.js - the minified dygraph code
    test/fpe_data.csv    - sample data from the GSIT3 dry run, 2017

Caveats:
---------------------------------------------------------------------

Modern web browsers do not let a page loaded from local disk read other
local files - they default to reporting a 'cross-origin request' error.
To work around this, you have two options:

Option 1) Run a web server, then load data.html from the web server.

    To run a simple python web server (Python 2 module name; with
    Python 3, the equivalent is "python3 -m http.server 8000"):

    python -m SimpleHTTPServer 8000

Option 2) Start chrome with the --allow-file-access-from-files option.

    On macosx, do it like this:

    open -a 'Google Chrome' --args --allow-file-access-from-files

    On linux, do it something like this:

    chrome --allow-file-access-from-files

    On windows, do it like this:

    chrome.exe --allow-file-access-from-files
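The web-server route in option 1 can also be scripted from Python 3 when you
want the server under program control rather than as a shell one-liner. A
minimal sketch using the standard library (the `serve` helper is ours, not
part of this package; port 8000 is just a convention):

```python
import threading
from http.server import HTTPServer, SimpleHTTPRequestHandler


def serve(port=8000):
    """Serve the current directory over HTTP so data.html can fetch
    the csv files without tripping cross-origin restrictions.

    Returns the running HTTPServer; call .shutdown() to stop it.
    """
    httpd = HTTPServer(("", port), SimpleHTTPRequestHandler)
    # serve_forever() blocks, so run it in a daemon thread and hand
    # the server object back to the caller.
    thread = threading.Thread(target=httpd.serve_forever, daemon=True)
    thread.start()
    return httpd

# httpd = serve()   # then browse to http://localhost:8000/data.html
# httpd.shutdown()  # stop serving when done
```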