This project was born on 02.08.2018, and a first working prototype was made available to the public on 09.08.2018. It is still quite imperfect and not very performant, but it works! I hope you enjoy it and maybe have ideas on how to improve the project.
The current setup of these scripts is made for one Nuki Web Account, which may include one or more Smartlocks. If there are several Smartlocks under one Web Account, the scripts should grab information from all locks automatically. If you want to monitor more than one Nuki Web Account, you will need to set up the same script architecture more than once. The same goes for the Nuki Bridge: each bridge requires its own script set. BUT: you only need one database! You may use the same database for everything, but if you like, you can also add a datasource per Smartlock.
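For example, each per-bridge script set could be driven by its own small config file. The following is only a hypothetical sketch: BRIDGE_IP and BRIDGE_TOKEN are made-up names, while FILE_BASE and the DB_* variables match those used in the import commands further below.

```bash
# Hypothetical per-bridge config (e.g. bridge1.conf), sourced by that
# bridge's copy of the scripts; one shared database serves all script sets.
BRIDGE_IP="192.168.178.50"     # address of this Nuki Bridge (assumption)
BRIDGE_TOKEN="abc123"          # the short Bridge API token (assumption)
FILE_BASE="/opt/nuki/bridge1"  # output prefix for the json/csv files
DB_USER="nuki"                 # shared MariaDB credentials --
DB_PASS="secret"               # the same database can serve every
DB_NAME="nuki"                 # Smartlock and Bridge script set
```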
## 4. Prerequisites
### 4.1 JQ JSON parser
JQ is a really nice tool for parsing JSON data. It also allows you to export/convert data to other file formats; in this case we use JQ's @csv filter. By the way, you can use https://jqplay.org/ to test different filter syntax without needing an SSH shell.
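As a quick illustration with made-up sample data, JQ can turn an array of JSON objects into CSV with a header row like this:

```bash
# Toy example: convert a JSON array of objects into CSV with a header line.
echo '[{"id":1,"name":"Front Door"},{"id":2,"name":"Back Door"}]' \
  | jq -r '(.[0] | keys_unsorted) as $keys | $keys, (.[] | [.[$keys[]]]) | @csv'
# Output:
# "id","name"
# 1,"Front Door"
# 2,"Back Door"
```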
```bash
mysql -u$DB_USER -p$DB_PASS $DB_NAME -e"TRUNCATE TABLE SmartlockLog; LOAD DATA LOCAL INFILE '"$FILE_BASE"_SmartlockLog.csv' INTO TABLE SmartlockLog FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n' IGNORE 1 LINES;"
eval "curl -X GET $CURL_HEADER $BASE_URL/smartlock/auth -o "$FILE_BASE"_SmartlockAuth.json"
mysql -u$DB_USER -p$DB_PASS $DB_NAME -e"TRUNCATE TABLE SmartlockAuth; LOAD DATA LOCAL INFILE '"$FILE_BASE"_SmartlockAuth.csv' INTO TABLE SmartlockAuth FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n' IGNORE 1 LINES;"
eval "curl -X GET $CURL_HEADER $BASE_URL/smartlock -o "$FILE_BASE"_Smartlock.json"
mysql -u$DB_USER -p$DB_PASS $DB_NAME -e"TRUNCATE TABLE Smartlock; LOAD DATA LOCAL INFILE '"$FILE_BASE"_Smartlock.csv' INTO TABLE Smartlock FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n' IGNORE 1 LINES;"
```
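Between each curl download and its mysql import there is a json→csv conversion step, which is only shown further below for the callback list. A sketch of one possible shape for the Smartlock table, producing the header line that IGNORE 1 LINES skips (the real filter lives in the repository's scripts):

```bash
# Sketch: derive the header from the first object's keys, then emit one csv
# row per smartlock. Nested fields would need flattening in a real filter.
jq -r '(.[0] | keys_unsorted) as $k | $k, (.[] | [.[$k[]]]) | @csv' \
  "$FILE_BASE"_Smartlock.json > "$FILE_BASE"_Smartlock.csv
```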
Because my database (the Grafana datasource) is on a different network than the Nuki Bridge, I needed to split up the installation. That means I grab the JSON files from the Nuki Bridge with a Raspberry Pi that is on the same network. Using an SSH user with public key authentication, I am able to pull the data over to the Odroid.
#### 5.2.1 On the collector node side
The following scripts are placed on a server that is on the same network as the Nuki Bridge. You could also open port 8080 through the firewall, but I did not want to do this because the basic Nuki token is really short and unsafe (only 6 characters). The script collects data from the Bridge and saves it to /opt/nuki/bridge. An rsync script then grabs this data; a possible shape of that pull is sketched below.
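The pull could look roughly like this; the host name, user, and key path are assumptions, only /opt/nuki/bridge is from the setup above:

```bash
# Runs on the database host (the Odroid): pull the collected files from the
# Raspberry Pi next to the Bridge via SSH with public key authentication.
rsync -az -e "ssh -i /home/nuki/.ssh/id_ed25519" \
  nuki@raspberrypi:/opt/nuki/bridge/ /opt/nuki/bridge/
```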
```bash
mysql -u$DB_USER -p$DB_PASS $DB_NAME -e"TRUNCATE TABLE BridgeInfo; LOAD DATA LOCAL INFILE '"$FILE_BASE"_info.csv' INTO TABLE BridgeInfo FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n' IGNORE 1 LINES;"
mysql -u$DB_USER -p$DB_PASS $DB_NAME -e"TRUNCATE TABLE BridgeList; LOAD DATA LOCAL INFILE '"$FILE_BASE"_list.csv' INTO TABLE BridgeList FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n' IGNORE 1 LINES;"
mysql -u$DB_USER -p$DB_PASS $DB_NAME -e"TRUNCATE TABLE BridgeLockState; LOAD DATA LOCAL INFILE '"$FILE_BASE"_lockState.csv' INTO TABLE BridgeLockState FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n' IGNORE 1 LINES;"
# will return "Cannot iterate over null (null)" if the file is empty!
jq -r ".|{callbacks}|\"$NUKIID\" as \$a|.callbacks[].nukiId=\$a|.[]|$JQ_CSV" "$FILE_BASE"_callback-list.json > "$FILE_BASE"_callback-list.csv
mysql -u$DB_USER -p$DB_PASS $DB_NAME -e"TRUNCATE TABLE BridgeCallbackList; LOAD DATA LOCAL INFILE '"$FILE_BASE"_callback-list.csv' INTO TABLE BridgeCallbackList FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n' IGNORE 1 LINES;"
mysql -u$DB_USER -p$DB_PASS $DB_NAME -e"TRUNCATE TABLE BridgeLog; LOAD DATA LOCAL INFILE '"$FILE_BASE"_log.csv' INTO TABLE BridgeLog FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n' IGNORE 1 LINES;"
```
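One way to avoid the "Cannot iterate over null (null)" error when the callback list is empty would be to fall back to an empty array with JQ's // operator. This is only a sketch, not the filter the scripts currently use:

```bash
# Sketch: (.callbacks // []) substitutes an empty array when .callbacks is
# null or missing, so an empty list yields an empty csv instead of an error.
jq -r "(.callbacks // []) | map(.nukiId=\"$NUKIID\") | .[] | $JQ_CSV" \
  "$FILE_BASE"_callback-list.json > "$FILE_BASE"_callback-list.csv
```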
* by adjusting the bash scripts you can modify the update frequency for the curl calls and the JSON parsing (the current default is 30 s)
* the Bridge Id has to be added so that the link between Smartlock and Bridge can be established. This could be automated, but at the moment it has to be done manually (replace it in the JQ parsing lines)
* you can customize offset and count for BridgeLog by redefining the arguments (see the sketch below)
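For example, a log request against the Bridge's HTTP API with explicit offset and count might look like this (IP and token are placeholders):

```bash
# Hypothetical call: fetch 200 log entries starting at offset 0.
curl "http://$BRIDGE_IP:8080/log?offset=0&count=200&token=$BRIDGE_TOKEN"
```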
## 6. Download Grafana Dashboard
https://grafana.com/dashboards/7628/ or https://leyghis.fablabchemnitz.de:8444/MarioVoigt/Nukiana/src/branch/master/Grafana/Nuki%20Smartlock%20+%20Bridge%20%28Internals%29.json
## 7. Remaining ToDos
* translation
  * at the moment the code is in English and the dashboard is only available in German
  * possible solutions:
    * add a language drop-down variable to Grafana + rewrite every SELECT so strings get translated automatically → decimal separators would have to be rewritten too
    * add extra translation columns to the MariaDB tables + some kind of translation string repository to fill the columns → could be utilized by DB triggers/routines
* rewrite the curl commands so that the log files (SmartlockLog, BridgeLog) are only streamed up to the point (Id) where the last entry was previously loaded, in order to ...
  * reduce bandwidth
  * speed up the process
* make the "sed" script lines nicer → i guess they can be completely removed by using JQ
* BridgeLog: "handles" are not imported; they are currently ignored
* the Bridge provides a timestamp but no timezoneOffset → the SQL statements were written to use the Smartlock's timezoneOffset instead → works, but dirty
* replace the TRUNCATE statements with "INSERT ... ON DUPLICATE KEY UPDATE" to keep old data while still importing new data (see the sketch after this list)
* build a check script that verifies the column count and column order of the csv files (also sketched after this list)
* collect data with a time history:
* history of firmware changes
* history of rssi values
* history of state changes: locking states, pairing state
* to test: this project has never been tested with more than one Smartlock or more than one Bridge. We only have one of each! I also did not test what data output a Fob will produce
* maybe try sending the JSON output files directly through JQ/Filebeat → Elasticsearch → a Grafana Elasticsearch datasource ↔ possible?
* write some data collector tables like "BridgeFirmwareHistory" and "SmartlockFirmwareHistory" for tracking firmware changes, utilizing some trigger
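Two of the ToDos above sketched out, with the usual caveats (untested, assumptions marked). Replacing TRUNCATE could also be done with LOAD DATA's REPLACE keyword, which keeps existing rows and only overwrites duplicate-key rows, provided the table has a suitable PRIMARY or UNIQUE key:

```bash
# Sketch, untested: REPLACE instead of TRUNCATE + plain import keeps old rows
# and overwrites only rows whose key already exists in SmartlockLog.
mysql -u$DB_USER -p$DB_PASS $DB_NAME -e"LOAD DATA LOCAL INFILE '"$FILE_BASE"_SmartlockLog.csv' REPLACE INTO TABLE SmartlockLog FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n' IGNORE 1 LINES;"
```

And a minimal column-count check for the csv files could start like this (naive: it miscounts if a quoted field contains a comma):

```bash
# Sketch: warn when a line's field count differs from the header line's.
awk -F',' 'NR==1{n=NF} NF!=n{print FILENAME ": line " NR " has " NF " fields, expected " n}' \
  "$FILE_BASE"_SmartlockLog.csv
```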